Constructing Tight Quadratic Relaxations for Global Optimization: I. Outer-Approximating Twice-Differentiable Convex Functions


When computing bounds, spatial branch-and-bound algorithms often linearly outer approximate convex relaxations for non-convex expressions in order to capitalize on the efficiency and robustness of linear programming solvers. Considering that linear outer approximations sacrifice accuracy when approximating highly nonlinear functions, and recognizing the recent advancements in the efficiency and robustness of available methods to solve optimization problems with quadratic objectives and constraints, we contemplate here the construction of quadratic outer approximations of twice-differentiable convex functions for use in deterministic global optimization. To this end, we present a novel cutting-plane algorithm that determines the tightest scaling parameter, $α$, in the second-order Taylor series approximation quadratic underestimator proposed by Su et al. We use a representative set of convex functions extracted from optimization benchmark libraries to showcase, both qualitatively and quantitatively, the tightness of the constructed quadratic underestimators and to demonstrate the overall computational efficiency of our algorithm. Furthermore, we extend our construction procedure to generate even tighter quadratic underestimators by allowing overestimation in infeasible polyhedral regions of optimization problems, as informed by the latter's linear constraints.


💡 Research Summary

This paper addresses a fundamental limitation of deterministic global optimization (DGO) algorithms that rely on linear outer approximations of convex relaxations: the loss of tightness when the underlying functions are highly nonlinear. Leveraging recent advances in robust quadratic programming solvers, the authors propose a systematic method to construct the tightest possible quadratic under‑estimators for any twice‑differentiable convex function.

The starting point is the second-order Taylor expansion at a construction point $x_0$, with the quadratic term scaled by a parameter $\alpha \in [0, 1]$:

$$q_\alpha(x) = f(x_0) + \nabla f(x_0)^\top (x - x_0) + \frac{\alpha}{2}\,(x - x_0)^\top \nabla^2 f(x_0)\,(x - x_0).$$

For sufficiently small $\alpha$, $q_\alpha$ underestimates the convex function $f$ over the domain of interest, and the proposed cutting-plane algorithm computes the largest such $\alpha$, yielding the tightest underestimator of this form.
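To make the scaling idea concrete, here is a minimal one-dimensional sketch. It is not the authors' cutting-plane algorithm (which is not reproduced in this summary); instead it finds the largest valid $\alpha$ by bisection, checking the underestimation property on a sampled grid over a box. All function names and the grid-based validity check are illustrative assumptions.

```python
# Hedged sketch: for a 1-D convex f, find (approximately) the largest
# alpha in [0, 1] such that the scaled second-order Taylor model
#   q_alpha(x) = f(x0) + f'(x0)(x - x0) + (alpha/2) f''(x0)(x - x0)^2
# underestimates f on [lo, hi]. Validity is checked on a finite grid,
# and alpha is located by bisection -- a stand-in for the paper's
# cutting-plane procedure.
import math

def scaled_taylor(f0, g0, h0, x0, alpha):
    """Return q_alpha as a callable, given f(x0), f'(x0), f''(x0)."""
    return lambda x: f0 + g0 * (x - x0) + 0.5 * alpha * h0 * (x - x0) ** 2

def max_alpha(f, df, d2f, x0, lo, hi, n=200, tol=1e-6):
    f0, g0, h0 = f(x0), df(x0), d2f(x0)
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]  # sample grid

    def valid(alpha):
        q = scaled_taylor(f0, g0, h0, x0, alpha)
        return all(q(x) <= f(x) + 1e-12 for x in xs)

    a, b = 0.0, 1.0
    if valid(b):          # full Taylor model already underestimates f
        return b
    while b - a > tol:    # bisect on the underestimation property
        m = 0.5 * (a + b)
        if valid(m):
            a = m
        else:
            b = m
    return a

# Example: f(x) = exp(x) on [-1, 1], expanded at x0 = 0.
# The binding point is x = -1, giving the analytic optimum alpha = 2/e.
alpha = max_alpha(math.exp, math.exp, math.exp, 0.0, -1.0, 1.0)
```

For $f(x) = e^x$ on $[-1, 1]$ with $x_0 = 0$, the condition $1 + x + \alpha x^2/2 \le e^x$ binds at $x = -1$, so the sketch recovers $\alpha^\ast = 2/e \approx 0.736$; in higher dimensions a grid check is impractical, which is precisely where a cutting-plane scheme earns its keep.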

