Moment-Sum-of-Squares Hierarchy
- The moment-sum-of-squares hierarchy is a systematic framework that approximates global solutions of polynomial optimization problems via dual SDP relaxations.
- It leverages effective Positivstellensatz bounds to provide explicit convergence rates and guarantees for pseudo-moment extraction and minimizer recovery.
- The approach extends to density-based upper-bound relaxations for convex problems, offering practical minimizer extraction with error bounds like O(1/sqrt(r)).
The moment-sum-of-squares (moment-SoS) hierarchy is a systematic approach for approximating global solutions to polynomial optimization problems (POPs) via convex semidefinite programming. It constructs dual sequences of relaxations, one based on moment constraints and the other on algebraic sum-of-squares (SoS) certificates, whose optima converge to the true minimum and whose optimal solutions can be interpreted as pseudo-moment functionals or probability measures on the minimizer set. The convergence rate of the hierarchy, the approximation quality of extracted minimizers, and the structure of the limit measures have been the subject of deep recent advances in quantitative real algebraic geometry and convex optimization.
1. Polynomial Optimization and the Moment-SoS Hierarchy
Consider the basic POP:
$$ f^* \;=\; \min_{x \in K} f(x), $$
where $f \in \mathbb{R}[x_1, \dots, x_n]$ is a polynomial of degree at most $d$, and $K \subset \mathbb{R}^n$ is a compact basic semialgebraic set, expressible as $K = \{x \in \mathbb{R}^n : g_1(x) \ge 0, \dots, g_m(x) \ge 0\}$.
The moment-SoS hierarchy is built via two dual sequences of semidefinite programs (SDPs):
- SoS tightenings: For each order $r$, compute
$$ \mathrm{lb}_r \;=\; \sup \{ \lambda \in \mathbb{R} \;:\; f - \lambda \in Q_r(g) \}, $$
where $Q_r(g)$ is the $r$-truncated quadratic module generated by the constraints, i.e., $Q_r(g) = \{ \sigma_0 + \sum_{i=1}^m \sigma_i g_i : \sigma_i \text{ SoS}, \ \deg \sigma_0 \le 2r, \ \deg(\sigma_i g_i) \le 2r \}$.
- Moment relaxations: The dual problem is
$$ \mathrm{mom}_r \;=\; \inf \Big\{ \textstyle\sum_{|\alpha| \le 2r} f_\alpha \, y_\alpha \;:\; y_0 = 1, \ M_r(y) \succeq 0, \ M_{r - \lceil \deg g_i / 2 \rceil}(g_i \, y) \succeq 0 \text{ for all } i \Big\}. $$
This is parametrized by pseudo-moments $y_\alpha$ for $|\alpha| \le 2r$, with the moment matrix $M_r(y)$ and localizing matrices $M_{r - \lceil \deg g_i / 2 \rceil}(g_i \, y)$ for each constraint.
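The moment side of the hierarchy can be sketched in pure Python. This is a minimal illustration, not the paper's implementation: the helper names `monomials` and `moment_matrix` are ours, and the pseudo-moments are taken from a Dirac measure, so the resulting moment matrix is genuinely positive semidefinite (rank one) by construction.

```python
from itertools import product

def monomials(n, d):
    """All exponent tuples alpha in N^n with |alpha| <= d, in graded order."""
    return sorted((a for a in product(range(d + 1), repeat=n) if sum(a) <= d),
                  key=lambda a: (sum(a), a))

def moment_matrix(y, n, r):
    """Moment matrix M_r(y): entry indexed by (a, b) is the pseudo-moment y_{a+b}."""
    basis = monomials(n, r)
    return [[y[tuple(p + q for p, q in zip(a, b))] for b in basis] for a in basis]

# Pseudo-moments of the Dirac measure at x* = (0.5, -1.0): y_alpha = (x*)^alpha.
xstar = (0.5, -1.0)
y = {a: xstar[0] ** a[0] * xstar[1] ** a[1] for a in monomials(2, 4)}
M = moment_matrix(y, n=2, r=2)  # 6x6 symmetric matrix, PSD since y comes from a measure
```

In an actual relaxation the entries of $y$ are decision variables of an SDP constrained by $M_r(y) \succeq 0$ and the localizing matrices; here they are fixed numbers only to show the indexing.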
Under the Archimedean condition (existence of $N > 0$ such that $N - \|x\|^2$ lies in the quadratic module $Q(g)$, a polynomial certificate that $K$ is bounded), the relaxation values satisfy weak duality and monotone convergence to the true minimum: $\mathrm{lb}_r \le \mathrm{mom}_r \le f^*$ and $\mathrm{lb}_r \uparrow f^*$ as $r \to \infty$ (Schlosser, 25 Feb 2025).
2. Effective Positivstellensatz and Convergence Rate
A central analytic tool is an explicit degree bound for Positivstellensatz representations. Baldi–Mourrain's effective Putinar theorem states: if $K$ is defined by an Archimedean quadratic module $Q(g)$ and $f$ is positive on $K$, then $f \in Q_r(g)$ for all
$$ r \;\ge\; \gamma \left( \frac{\|f\|}{\min_K f} \right)^{c}, $$
for explicit constants $\gamma > 0$ and $c \ge 1$ depending on the dimension, the degrees, and the constraint polynomials $g_i$. As a corollary, the order-$r$ SoS relaxation value satisfies
$$ 0 \;\le\; f^* - \mathrm{lb}_r \;\le\; C \, r^{-1/c}. $$
This provides a polynomial convergence rate for the value gap as $r \to \infty$, which improves upon prior exponential or non-quantitative bounds for general POP instances (Schlosser, 25 Feb 2025).
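The practical meaning of a polynomial value-gap rate can be read off by inverting it; the constants below are the unspecified ones from the effective Putinar theorem, so this is a back-of-envelope sketch rather than a sharp estimate:

```latex
% If the order-r SoS lower bound satisfies  f^* - \mathrm{lb}_r \le C\, r^{-1/c},
% then certifying the minimum to accuracy \varepsilon requires relaxation order
r \;\ge\; \left( \frac{C}{\varepsilon} \right)^{\!c},
% i.e. the SDP size grows polynomially in 1/\varepsilon, with exponent
% governed by the Positivstellensatz constant c.
```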
3. Weak and Quantitative Moment Measure Convergence
Any sequence $(y^{(r)})$ of optimal solutions to the $r$-th moment relaxation yields (by Tchakaloff's theorem and standard functional analysis) a sequence of measures $\mu_r$ supported on $K$ whose moments up to degree $2r$ match $y^{(r)}$. As $r \to \infty$, every weak-* limit point of $(\mu_r)$ corresponds to a probability measure supported on the optimal level set $K^* = \{x \in K : f(x) = f^*\}$ (Schlosser, 25 Feb 2025).
The recent quantitative refinement establishes an explicit polynomial rate for the convergence of the low-order moments: for each fixed degree bound $k$,
$$ \max_{|\alpha| \le k} \Big| y^{(r)}_\alpha - \int x^\alpha \, d\mu \Big| \;\le\; \varepsilon_r, \qquad \varepsilon_r = O(r^{-\theta}), $$
with explicit dependence of the rate on the problem data (the exponent $\theta > 0$ is inherited from the effective Positivstellensatz), and where $\mu$ is a probability measure supported on $K^*$ (Schlosser, 25 Feb 2025). Thus, for fixed truncation of moments (polynomials of bounded degree), the pseudo-moments produced by the SDP approach the true moments of the minimizer measure at a known quantitative rate.
4. Minimizer Extraction and Linear Estimator Guarantee
In the case where the global minimizer is unique ($K^* = \{x^*\}$), the above translates to convergence of the first-order moments:
$$ \big\| \hat{x}_r - x^* \big\| \;\le\; C \, r^{-\theta}, \qquad \hat{x}_r := \big( y^{(r)}_{e_1}, \dots, y^{(r)}_{e_n} \big), $$
where $e_i$ is the $i$-th unit multi-index and the exponent $\theta > 0$ again comes from the effective Positivstellensatz (Schlosser, 25 Feb 2025). This provides an explicit error bound for the estimated minimizer constructed from the pseudo-moment vector output by the moment SDP. The constant depends on the Łojasiewicz parameters of $f$ near $x^*$.
This linear minimizer-extractor is efficient, dimension-explicit, and carries a convergence guarantee stemming from the underlying effective Positivstellensatz.
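The extractor itself is a plain coordinate read-off of the pseudo-moment vector, which is what makes it linear and dimension-explicit. A minimal sketch (the helper name `extract_minimizer` and the dict-of-tuples moment encoding are ours):

```python
def extract_minimizer(y, n):
    """Linear minimizer-estimator: the candidate minimizer is the vector of
    degree-one pseudo-moments (y_{e_1}, ..., y_{e_n}), i.e. the
    pseudo-expectation of the coordinate functions x_1, ..., x_n."""
    def unit(i):
        return tuple(1 if j == i else 0 for j in range(n))
    return [y[unit(i)] for i in range(n)]

# Sanity check on the exact moments of a Dirac measure at x*, where the
# estimator recovers the minimizer exactly:
y = {(1, 0): 0.5, (0, 1): -1.0}
assert extract_minimizer(y, 2) == [0.5, -1.0]
```

On actual SDP output the pseudo-moments only approximate Dirac moments, and the error bound above controls how far this read-off can be from $x^*$.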
5. Upper-bound (Density-based) SoS Hierarchy and Convex Optimization
A parallel approach for obtaining upper bounds is the $r$-th order "density" SoS relaxation:
$$ \mathrm{ub}_r \;=\; \min_{\sigma} \left\{ \int_K f \, \sigma \, d\mu \;:\; \int_K \sigma \, d\mu = 1, \ \sigma \text{ SoS of degree} \le 2r \right\}, $$
where $\mu$ is a fixed reference measure on $K$ (e.g., Lebesgue). For convex $f$ and full-dimensional convex $K$, De Klerk–Laurent–Sun (2017) show
$$ \mathrm{ub}_r - f^* \;=\; O(1/\sqrt{r}), $$
and for any $\epsilon$-optimal density $\sigma$, the barycenter $x_\sigma := \int_K x \, \sigma(x) \, d\mu(x)$ satisfies
$$ f(x_\sigma) - f^* \;\le\; (\mathrm{ub}_r - f^*) + \epsilon, $$
since $f(x_\sigma) \le \int_K f \sigma \, d\mu$ by Jensen's inequality for convex $f$. If the minimizer is unique, $x_\sigma \to x^*$. Convexity is thus leveraged via Jensen-type inequalities for $f$ and further improves extraction of minimizers (Schlosser, 25 Feb 2025).
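A concrete toy instance (ours, not from the cited works) makes the mechanism visible: minimize $f(x) = x^2$ over $K = [0, 1]$ with Lebesgue reference measure, so $f^* = 0$ at $x^* = 0$. The normalized density $\sigma_r(x) \propto ((1-x)^r)^2$ is a feasible SoS density of degree $2r$ concentrating at $x^*$, and exact Beta-function integrals give its objective value in closed form; on this easy instance the bound decays like $O(1/r^2)$, faster than the worst-case $O(1/\sqrt{r})$ guarantee.

```python
from fractions import Fraction

def density_upper_bound(r):
    """Upper bound on min_{x in [0,1]} x^2 (true minimum 0) certified by the
    feasible SoS density sigma_r(x) proportional to (1 - x)^(2r).
    Exact Beta integrals:
        int_0^1 x^2 (1-x)^(2r) dx / int_0^1 (1-x)^(2r) dx
          = 2 / ((2r + 2)(2r + 3)),
    an O(1/r^2) upper bound on the minimum."""
    return Fraction(2, (2 * r + 2) * (2 * r + 3))

bounds = [density_upper_bound(r) for r in (1, 2, 4, 8)]
# The certified upper bounds decrease monotonically toward the true minimum 0.
assert all(b1 > b2 for b1, b2 in zip(bounds, bounds[1:]))
```

Using `Fraction` keeps the Beta-integral arithmetic exact, so the decrease of the bounds is a proof for this instance rather than a floating-point observation.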
6. Broad Applicability and Duality Structure
These convergence phenomena are direct consequences of the duality between the moment side (positive semidefinite moment and localizing matrices indexed by truncated multi-indices) and SoS certificates of nonnegativity for polynomials on $K$. They hold under the Archimedean compactness condition, and transfer to a wide range of generalizations: matrix-valued PMIs, nonlinear SDPs, infinite-dimensional linear programs over occupation measures for control and PDEs, volume computation, and more, with modifications to the moment and SoS structures appropriate to the domain (Tyburec et al., 2020, Tacchi, 2020, Lasserre, 2018).
These guarantees are strictly quantitative when the POP admits an effective Putinar representation with determined constants, and are robust to solution multiplicity (support spread over the minimizer set $K^*$). In the unique-minimizer regime, the SoS/moment SDP hierarchy provides an explicit convergence rate for both the value and the computed minimizer. For problems where $f$ and $K$ are convex, the density-based SoS hierarchy provides particularly fast convergence in $r$.
Key reference:
- "Convergence rate for linear minimizer-estimators in the moment-sum-of-squares hierarchy" (Schlosser, 25 Feb 2025)