Moment–SOS Relaxation Method
- Moment–SOS relaxation is a convex algebraic-geometric method that enables global optimization of general polynomial functions over semi-algebraic sets using a hierarchy of SDP relaxations.
- It constructs moment and localizing matrices that ensure asymptotic and finite convergence under optimality conditions such as LICQ, SCC, and SOSC.
- The framework extends to matrix optimization, sparse hierarchies, and noncompact domains via homogenization and denominator-SOS techniques to certify global minima.
Moment–SOS Relaxation
The Moment–Sum-of-Squares (Moment–SOS) relaxation is a convex algebraic-geometric methodology that enables global optimization of general polynomial functions over semi-algebraic sets, as well as dual formulations for generalized moment problems. It systematically extends to settings with equality and inequality constraints for both scalar and matrix-valued polynomials, operates with non-real radical constraint ideals, and is applicable to compact and certain noncompact domains via homogenization or denominator reformulations. The moment–SOS hierarchy constructs a sequence of semidefinite programming (SDP) relaxations of increasing order, with theoretical guarantees on asymptotic and, under sufficient conditions, finite convergence to the exact global optimum or minimum of the associated optimization problem.
1. Algebraic Framework and Duality
Given a tuple of real polynomials $g = (g_1, \ldots, g_m)$ in variables $x = (x_1, \ldots, x_n)$, and a closed semialgebraic set defined as

$$K = \{\, x \in \mathbb{R}^n : g_1(x) \ge 0, \ldots, g_m(x) \ge 0 \,\},$$

the moment–SOS method is built on the following pair of dual conic optimization problems (Huang et al., 2023):
- Primal (Conic) Form:

$$f_{\min} = \sup\,\{\, \gamma \in \mathbb{R} : f - \gamma \in \mathscr{P}_d(K) \,\},$$

where $\mathscr{P}_d(K)$ is the cone of degree-$d$ polynomials nonnegative on $K$.
- Dual (Generalized Moment) Form:

$$f_{\min} = \inf\,\{\, \langle f, y \rangle : y \in \mathscr{R}_d(K),\ y_0 = 1 \,\},$$

where $\mathscr{R}_d(K)$ is the cone of degree-$d$ truncated moment sequences admitting a representing measure supported on $K$.
This structure generalizes classical global polynomial optimization and the generalized moment problem, and supports study in settings including noncommutative algebra and polynomial matrix inequalities (Guo et al., 2023, Huang et al., 2024).
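A minimal numerical illustration of this duality (a hypothetical univariate example, not taken from the cited papers): for $f(x) = x^2 - 2x$ on $K = \mathbb{R}$, a feasible primal bound and a dual moment sequence meet at the global minimum $-1$, so there is no duality gap.

```python
# Weak-duality sanity check for f(x) = x^2 - 2x on K = R (illustrative example).
# Primal side: the largest gamma with f - gamma nonnegative on R is gamma = -1,
# since f(x) + 1 = (x - 1)^2 >= 0.
# Dual side: the Dirac measure at x = 1 has truncated moments y = (1, 1, 1),
# and <f, y> = y_2 - 2*y_1 = -1, matching the primal bound.
y = (1.0, 1.0, 1.0)            # y_0, y_1, y_2 for the Dirac measure at x = 1
dual_value = y[2] - 2 * y[1]   # <f, y>
primal_value = -1.0            # sup { gamma : f - gamma >= 0 on R }

# Spot-check primal feasibility of gamma = -1 on a grid (up to float error).
feasible = all((x * x - 2 * x) - primal_value >= -1e-12
               for x in [t / 10.0 for t in range(-50, 51)])
```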
2. Hierarchical Relaxations: Moment and SOS Formulations
The central computational objects of the moment–SOS method are:
- Moment Matrices: For a degree-$2k$ truncated multisequence $y = (y_\alpha)_{|\alpha| \le 2k}$, the $k$-th moment matrix

$$M_k[y] := \big( y_{\alpha + \beta} \big)_{|\alpha|,\,|\beta| \le k},$$

indexed by monomials up to degree $k$.
- Localizing Matrices: For a polynomial $g = \sum_\gamma g_\gamma\, x^\gamma$ of degree up to $2k$, the $k$-th localizing matrix

$$L_g^{(k)}[y] := \Big( \textstyle\sum_\gamma g_\gamma\, y_{\gamma + \alpha + \beta} \Big)_{|\alpha|,\,|\beta| \le k - \lceil \deg(g)/2 \rceil}.$$
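These objects are straightforward to assemble explicitly. The sketch below is an illustrative construction (not code from the cited works; indexing is simplified by keeping the localizing matrix at full order $k$), using the moments of the Dirac measure at $x = 2$, for which both matrices come out positive semidefinite of rank one.

```python
# Build moment and localizing matrices from a truncated moment dictionary.
import itertools
import numpy as np

def monomials(n, k):
    """All exponent tuples alpha in N^n with |alpha| <= k, in graded order."""
    return [a for d in range(k + 1)
            for a in itertools.product(range(d + 1), repeat=n) if sum(a) == d]

def moment_matrix(y, n, k):
    """M_k[y] with entries y_{alpha + beta}."""
    mons = monomials(n, k)
    return np.array([[y[tuple(ai + bi for ai, bi in zip(a, b))]
                      for b in mons] for a in mons])

def localizing_matrix(y, g, n, k):
    """L_g[y] with entries sum_gamma g_gamma * y_{gamma + alpha + beta};
    g is a sparse {exponent_tuple: coeff} dict (index range kept at order k)."""
    mons = monomials(n, k)
    return np.array([[sum(c * y[tuple(gi + ai + bi
                                      for gi, ai, bi in zip(ge, a, b))]
                          for ge, c in g.items())
                      for b in mons] for a in mons])

# Moments of the Dirac measure at x = 2 (univariate): y_alpha = 2^alpha.
y = {(d,): 2.0 ** d for d in range(9)}
M2 = moment_matrix(y, n=1, k=2)                              # PSD, rank 1
L = localizing_matrix(y, {(0,): -1.0, (2,): 1.0}, n=1, k=2)  # g(x) = x^2 - 1
```

Since $x = 2$ satisfies $g(x) = x^2 - 1 \ge 0$, the localizing matrix is PSD as well, consistent with $y$ having a representing measure on $\{g \ge 0\}$.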
The Moment–SOS hierarchy at order $k$ constructs (Huang et al., 2023):
- Primal Moment Relaxation (order $k$):

$$f_k := \min_y\ \langle f, y \rangle$$

subject to linear constraints $y_0 = 1$ and

$$M_k[y] \succeq 0, \qquad L_{g_j}^{(k)}[y] \succeq 0, \quad j = 1, \ldots, m.$$

- Dual SOS Relaxation (order $k$):

$$f_k' := \max_\gamma\ \gamma$$

subject to

$$f - \gamma \in \mathrm{QM}(g)_{2k} := \Big\{ \sigma_0 + \textstyle\sum_{j=1}^m \sigma_j g_j : \sigma_j \text{ SOS},\ \deg(\sigma_0),\, \deg(\sigma_j g_j) \le 2k \Big\},$$

where SOS polynomials $\sigma_0, \ldots, \sigma_m$ and the degree-$2k$ truncated quadratic module $\mathrm{QM}(g)_{2k}$ are employed.
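As a concrete instance of the dual side (an illustrative univariate example, not from the cited papers): for $f(x) = x^4 - 2x^2$ with $f_{\min} = -1$, the certificate $f - f_{\min} \in \Sigma[x]$ is witnessed by a PSD Gram matrix over the monomial basis $(1, x, x^2)$.

```python
# Verify an SOS certificate numerically: gamma = -1 is the exact global
# minimum of f(x) = x^4 - 2x^2, attained at x = +/-1.
import numpy as np

# f(x) - (-1) = x^4 - 2x^2 + 1 = v(x)^T Q v(x)  with  v(x) = (1, x, x^2).
Q = np.array([[ 1.0, 0.0, -1.0],
              [ 0.0, 0.0,  0.0],
              [-1.0, 0.0,  1.0]])

# Q is PSD (eigenvalues 0, 0, 2), so f + 1 = (x^2 - 1)^2 is a sum of squares
# and gamma = -1 is a certified lower bound at relaxation order k = 2.
min_eig = np.linalg.eigvalsh(Q).min()

# Cross-check the polynomial identity v^T Q v = x^4 - 2x^2 + 1 at samples.
identity_ok = all(
    abs(np.array([1.0, x, x * x]) @ Q @ np.array([1.0, x, x * x])
        - (x ** 4 - 2 * x ** 2 + 1)) < 1e-9
    for x in (-2.0, -1.0, 0.0, 0.5, 3.0)
)
```

In an actual solver, $Q$ is the decision variable of the SDP; here it is written down by hand to show what the certificate looks like.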
Asymptotic convergence: Provided $K$ is compact and the quadratic module $\mathrm{QM}(g)$ is archimedean, the relaxation sequence is monotone with $f_k' \le f_k \le f_{\min}$ and $f_k \to f_{\min}$ as $k \to \infty$.
A flat extension rank condition,

$$\operatorname{rank} M_{k - d_g}[y] = \operatorname{rank} M_k[y], \qquad d_g := \max_j \lceil \deg(g_j)/2 \rceil,$$

permits extraction of atomic measures realizing the exact optimizer (Huang et al., 2023, Guo et al., 2023).
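The flat-truncation test and the subsequent atom extraction can be sketched in the univariate case (illustrative simplification; the general multivariate extraction is the Henrion–Lasserre algorithm). For the average of Dirac measures at $1$ and $3$, the moment-matrix ranks stabilize at $2$, and the atoms are the eigenvalues of a multiplication operator.

```python
import numpy as np

# Moments y_d = (1^d + 3^d)/2 of the measure (delta_1 + delta_3)/2.
y = [(1.0 ** d + 3.0 ** d) / 2 for d in range(5)]

M1 = np.array([[y[i + j] for j in range(2)] for i in range(2)])  # M_1[y]
M2 = np.array([[y[i + j] for j in range(3)] for i in range(3)])  # M_2[y]

# Flat truncation: rank M_1 = rank M_2 certifies a finitely atomic
# representing measure with rank-many atoms.
flat = np.linalg.matrix_rank(M1) == np.linalg.matrix_rank(M2)

# Univariate atom extraction: atoms are the eigenvalues of the
# multiplication-by-x operator M_1^{-1} N, where N_{ij} = y_{i+j+1}.
N = np.array([[y[i + j + 1] for j in range(2)] for i in range(2)])
atoms = np.sort(np.linalg.eigvals(np.linalg.solve(M1, N)).real)
```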
3. Finite Convergence and Optimality Conditions
While asymptotic convergence is guaranteed under archimedeanness, finite convergence of the Moment–SOS hierarchy is established under classical nonlinear optimization conditions holding at every global minimizer of the underlying minimization:
- Linear Independence Constraint Qualification (LICQ): Gradients of the active constraints (equalities and active inequalities) are linearly independent.
- Strict Complementarity Condition (SCC): Lagrange multipliers for active constraints are strictly positive.
- Second Order Sufficiency Condition (SOSC): Lagrangian Hessian is positive definite on the critical subspace.
Under these assumptions, finite convergence is achieved: there exists an order $k_0$ such that for all $k \ge k_0$, $f_k = f_k' = f_{\min}$,
and all optimal moment sequences at level satisfy the flat extension rank criterion [(Huang et al., 2023), Theorem 3.2]. This result holds even when the constraint ideal is not real radical. Illustrative cases demonstrating finite convergence with non-real radical ideals include nonconvex sphere constraints and degeneracies where only SCC/SOSC are satisfied [(Huang et al., 2023), Example 3.4, 3.5].
4. Homogenization and Denominator Hierarchies for Unbounded Domains
For noncompact $K$, the hierarchy may not exhibit finite convergence. The method introduces a homogenization variable $x_0$, replacing each polynomial $p$ with its homogenization $\tilde{p}(x_0, x) = x_0^{\deg p}\, p(x / x_0)$ and imposing $x_0 \ge 0$, $x_0^2 + \|x\|^2 = 1$ as additional constraints. The feasible set becomes compact, and an equivalent moment–SOS hierarchy applies, with the same finite convergence guarantees under optimality conditions on the homogenized problem [(Huang et al., 2023), Theorem 4.3]. This resolves key open conjectures in the literature by removing the need for a real radical constraint ideal (Huang et al., 2023).
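Homogenization itself is a purely syntactic operation. A minimal sketch (hypothetical helper; the sparse `{exponent_tuple: coeff}` representation is an assumption): prepend an $x_0$ exponent so every term reaches the total degree, and check that setting $x_0 = 1$ recovers the original polynomial.

```python
# Homogenize a sparse polynomial by adding a new first variable x0.
def homogenize(poly):
    d = max(sum(a) for a in poly)                        # total degree
    return {(d - sum(a),) + a: c for a, c in poly.items()}

def evalpoly(poly, pt):
    """Evaluate a sparse polynomial at the point pt."""
    total = 0.0
    for expo, coeff in poly.items():
        term = coeff
        for xi, ai in zip(pt, expo):
            term *= xi ** ai
        total += term
    return total

f = {(3,): 1.0, (1,): -1.0, (0,): 2.0}   # f(x) = x^3 - x + 2
ft = homogenize(f)                        # x^3 - x0^2 x + 2 x0^3

# Setting x0 = 1 recovers f; restricting to the unit sphere with x0 >= 0
# is what makes the homogenized feasible set compact.
```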
In addition, the denominator-SOS hierarchy uses certificates of the form

$$(1 + \|x\|^2)^N (f - \gamma) \in \mathrm{QM}(g)$$

for some exponent $N \in \mathbb{N}$, with convergence (and finite convergence under classical optimality conditions) similarly assured [(Huang et al., 2023), Theorem 4.7].
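A classical illustration of such a denominator certificate (standard in the SOS literature, independent of the cited results): the Motzkin polynomial is nonnegative on $\mathbb{R}^2$ by the AM–GM inequality but is not a sum of squares, yet it becomes SOS after multiplication by a single denominator factor.

```latex
% Motzkin polynomial: nonnegative on R^2, but not a sum of squares.
M(x, y) = x^4 y^2 + x^2 y^4 - 3 x^2 y^2 + 1 \;\ge\; 0
% Denominator certificate with exponent N = 1:
(x^2 + y^2 + 1)\, M(x, y) \;\in\; \Sigma[x, y]
```

So the hierarchy with denominators certifies nonnegativity already at $N = 1$, even though no direct SOS certificate exists.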
5. Generalizations: Polynomial Matrix and Sparse Hierarchies
The framework readily extends to:
- Polynomial Matrix Inequality (PMI) Optimization: Optimization over $x \in \mathbb{R}^n$ subject to $G(x) \succeq 0$ for a symmetric polynomial matrix $G(x)$. The corresponding matrix moment–SOS hierarchy replaces scalar moment and localizing matrices by block versions and uses quadratic modules of the form $\{\sigma_0 + \sum_i p_i^\top G\, p_i : \sigma_0 \text{ SOS},\ p_i \text{ polynomial vectors}\}$; finite convergence is guaranteed under archimedeanness and NDC/SCC/SOSC at all minimizers (Huang et al., 2024, Guo et al., 2023).
- Sparse and Block-Decomposed Hierarchies: When the problem admits correlative or term sparsity, as formalized by the running intersection property and term-sparsity patterns, relaxations assemble from block or clique-structured moment matrices. Finite convergence and “tightness” are characterized in terms of decomposability of the SOS certificate across blocks (i.e., the ability to write $f - f_{\min}$ as a sum of block-local elements of the ideal plus quadratic module) (Nie et al., 2024, Wang et al., 2019, Wang et al., 2020).
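The payoff of such decompositions can be seen from a back-of-the-envelope size count (hypothetical clique structure, illustrative only): replacing one dense moment matrix by clique-local blocks shrinks the PSD constraints dramatically.

```python
# Dense vs. clique-decomposed moment-matrix sizes at relaxation order k.
# Hypothetical sparsity pattern: 20 variables in 5 cliques of 5 variables,
# with overlaps assumed to satisfy the running intersection property.
from math import comb

def moment_matrix_side(n_vars, k):
    """Side length of M_k[y]: number of monomials of degree <= k."""
    return comb(n_vars + k, k)

n, k = 20, 2
dense_side = moment_matrix_side(n, k)                        # one 231 x 231 block
clique_sides = [moment_matrix_side(5, k) for _ in range(5)]  # five 21 x 21 blocks
```

Since SDP solve time scales steeply with block size, five $21 \times 21$ blocks are far cheaper than a single $231 \times 231$ block.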
6. Applications and Examples
The Moment–SOS framework applies to a broad class of polynomial optimization and generalized moment problems:
- Polynomial optimization on products of spheres and tensor approximation: For multihomogeneous optimization over products of spheres, finite convergence generically holds, with connections to best rank-one tensor approximation (Halaseh et al., 11 Dec 2025).
- Polynomial differential equations and PDE-constrained control: POPs arising from discretization or occupation measure reformulations of PDEs can be solved via sparse moment–SOS relaxations (Mevissen et al., 2010, Lebarbé et al., 2024, Chhatoi et al., 2024). For noncompact control sets, partial homogenization achieves compactness and convergence without a relaxation gap (Sehnalová et al., 17 Mar 2025).
- Generalized moment problems: The method includes effective convergence rates for geometry-dependent problems such as moment-constrained minimal rank tensor decomposition (Gamertsfelder et al., 16 Jan 2025, Huang et al., 2024).
Other domains include robust and semi-infinite polynomial matrix optimization, stochastic program approximations, and optimal power flow (Guo et al., 2023, Josz et al., 2013).
7. Algorithmic and Computational Considerations
- The core relaxation at order $k$ leads to semidefinite programs with moment and localizing matrices whose sizes grow combinatorially with the number of variables $n$ and the relaxation order $k$ (side length $\binom{n+k}{k}$ for the moment matrix).
- Block and sparse relaxations dramatically reduce computational cost for large, structured problems (Wang et al., 2019, Wang et al., 2020, Nie et al., 2024).
- Flat extension detection enables algebraic extraction of finitely atomic measures corresponding to global minimizers (Huang et al., 2023, Guo et al., 2023).
- Large-scale moment–SOS relaxations exploiting polyhedral–SDP structure can be solved efficiently via low-rank augmented Lagrangian methods (Hou et al., 6 Dec 2025).
- For problems with degenerate constraint ideals (positive-dimensional varieties), branch-and-bound or filtering techniques in combination with moment–SOS lower bounds provide practical approaches for approximate global optimization (Mohammadi et al., 2016, Ðurašinović et al., 24 Jan 2025).
The Moment–SOS relaxation paradigm offers a unified, algebraically grounded hierarchy for nonconvex polynomial and moment-constrained optimization, with deep connections to algebraic geometry, numerical analysis, and semidefinite programming. Recent work resolves major conjectures by establishing that finite convergence can be certified under optimality conditions without recourse to real radicality—leading to both theoretical completeness and robust applicability across diverse domains (Huang et al., 2023).