
Moment–SOS Relaxation Method

Updated 18 January 2026
  • Moment–SOS relaxation is a convex algebraic-geometric method that enables global optimization of general polynomial functions over semi-algebraic sets using a hierarchy of SDP relaxations.
  • The hierarchy is built from moment and localizing matrices; it converges asymptotically, and in finitely many steps under classical optimality conditions such as LICQ, SCC, and SOSC.
  • The framework extends to matrix optimization, sparse hierarchies, and noncompact domains via homogenization and denominator-SOS techniques to certify global minima.

Moment–SOS Relaxation

The Moment–Sum-of-Squares (Moment–SOS) relaxation is a convex algebraic-geometric methodology that enables global optimization of general polynomial functions over semi-algebraic sets, as well as dual formulations for generalized moment problems. It systematically extends to settings with equality and inequality constraints for both scalar and matrix-valued polynomials, operates with non-real radical constraint ideals, and is applicable to compact and certain noncompact domains via homogenization or denominator reformulations. The moment–SOS hierarchy constructs a sequence of semidefinite programming (SDP) relaxations of increasing order, with theoretical guarantees on asymptotic and, under sufficient conditions, finite convergence to the exact global optimum or minimum of the associated optimization problem.

1. Algebraic Framework and Duality

Given the ring of real polynomials in $n$ variables,

$$\mathbb{R}[x] = \mathbb{R}[x_1, \dots, x_n],$$

and a closed semialgebraic set defined as

$$K = \left\{ x \in \mathbb{R}^n : c_j(x) = 0 \ \text{for } j \in E, \;\; c_j(x) \ge 0 \ \text{for } j \in I \right\},$$

the moment–SOS method is built on the following pair of dual conic optimization problems (Huang et al., 2023):

  • Primal (Conic) Form:

$$\max_{\theta \in \mathbb{R}^m} \ b^T\theta \quad \text{subject to} \quad f(x) - \sum_{i=1}^m \theta_i a_i(x) \in P_d(K), \quad \theta_i \ge 0 \ \text{for } i > m_1,$$

where $P_d(K)$ is the cone of degree-$d$ polynomials nonnegative on $K$.

  • Dual (Generalized Moment) Form:

$$\min_{y \in \mathbb{R}^{|N_d|}} \ (f, y) \quad \text{subject to} \quad (a_i, y) = b_i \ (i \le m_1), \quad (a_i, y) \ge b_i \ (i > m_1), \quad y \in R_d(K),$$

where $R_d(K)$ is the cone of truncated moment sequences admitting a representing measure supported on $K$.

This structure generalizes classical global polynomial optimization and the generalized moment problem, and extends to settings including noncommutative algebra and polynomial matrix inequalities (Guo et al., 2023, Huang et al., 2024).
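As a concrete special case, classical polynomial minimization over $K$ is recovered from the dual pair above by taking $m = m_1 = 1$, $a_1(x) \equiv 1$, $b_1 = 1$ (a sketch consistent with the notation above):

```latex
% Classical minimization of f over K as a special case of the dual pair:
% take m = m_1 = 1, a_1(x) \equiv 1, b_1 = 1.
\max_{\gamma \in \mathbb{R}} \ \gamma
   \quad \text{subject to} \quad f(x) - \gamma \in P_d(K)
\qquad \Longleftrightarrow \qquad
\min_{y} \ (f, y)
   \quad \text{subject to} \quad (1, y) = 1, \ \ y \in R_d(K),
```

so the dual side optimizes over truncated moment sequences of probability measures on $K$, and the optimal $\gamma$ is the global minimum of $f$ on $K$.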

2. Hierarchical Relaxations: Moment and SOS Formulations

The central computational objects of the moment–SOS method are:

  • Moment Matrices: For a degree-$2k$ truncated multisequence $y = (y_\alpha)_{|\alpha| \leq 2k}$, the $k$-th moment matrix,

$$(M_k[y])_{\alpha, \beta} = y_{\alpha+\beta},$$

indexed by monomials up to degree $k$.

  • Localizing Matrices: For a polynomial $q$ of degree at most $2\ell$, the $k$-th localizing matrix,

$$L_k[q][y]_{\alpha, \beta} = \sum_\gamma q_\gamma\, y_{\alpha + \beta + \gamma}, \quad |\alpha|, |\beta| \leq k - \lceil \deg(q)/2 \rceil.$$

The Moment–SOS hierarchy at order $k$ constructs (Huang et al., 2023):

  • Primal Moment Relaxation (order $k$):

$$\min_w \ (f, w)$$

subject to linear constraints and

$$L_{k-\lceil \deg(c_j)/2 \rceil}[c_j][w] = 0 \ \ (j \in E),\quad L_{k-\lceil \deg(c_j)/2 \rceil}[c_j][w] \succeq 0 \ \ (j \in I),\quad M_k[w] \succeq 0.$$

  • Dual SOS Relaxation (order $k$):

$$\max_\theta \ b^T\theta$$

subject to

$$f - \sum_{i=1}^m \theta_i a_i \in \mathrm{Ideal}[c_j : j \in E]_{2k} + \mathrm{QM}[c_j : j \in I]_{2k}, \quad \theta_i \geq 0 \ \ (i > m_1),$$

where the certificate is expressed through SOS polynomials and the quadratic module $\mathrm{QM}[\cdot]$.
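Membership in the quadratic module is certified at each order by a positive semidefinite Gram matrix. As a minimal hand-check (using the standard Gram-matrix formulation of SOS; the example polynomial is illustrative), one can verify that $f(x) = x^4 - 2x^2 + 1 = (x^2 - 1)^2$ admits a certificate $f = m(x)^T Q\, m(x)$ with $Q \succeq 0$:

```python
import numpy as np

# f(x) = x^4 - 2x^2 + 1, coefficients of x^0 .. x^4
f = np.array([1.0, 0.0, -2.0, 0.0, 1.0])

# Candidate Gram matrix Q in the monomial basis m(x) = (1, x, x^2),
# so that f(x) = m(x)^T Q m(x).  Here Q = c c^T with c = (-1, 0, 1),
# reflecting f = (x^2 - 1)^2.
c = np.array([-1.0, 0.0, 1.0])
Q = np.outer(c, c)

# Coefficient match: Q_{ij} contributes to the coefficient of x^{i+j}.
recovered = np.zeros(5)
for i in range(3):
    for j in range(3):
        recovered[i + j] += Q[i, j]
print(np.allclose(recovered, f))               # True
print(np.linalg.eigvalsh(Q).min() >= -1e-12)   # True: Q PSD => f is SOS
```

In the actual hierarchy, the solver searches over all PSD Gram matrices satisfying the coefficient-matching linear constraints, which is exactly an SDP feasibility problem.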

Asymptotic convergence: Provided $K$ is compact and the quadratic module $\mathrm{QM}[c_j : j \in I]$ is archimedean, the relaxation bounds are monotone with $v_k \to v^*$ and $\theta_k \to \theta^*$.

A flat extension rank condition,

$$\mathrm{rank}\, M_{k-d}[w] = \mathrm{rank}\, M_{k}[w],$$

permits extraction of atomic measures realizing the exact optimizer (Huang et al., 2023, Guo et al., 2023).
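For intuition, flatness can be observed directly on the moments of a finitely atomic measure. The sketch below (univariate and NumPy-based; illustrative rather than from the cited papers) shows $\mathrm{rank}\, M_1 = \mathrm{rank}\, M_2 = 2$ for the two-atom measure $\tfrac{1}{2}(\delta_{1} + \delta_{-1})$, matching the number of atoms:

```python
import numpy as np

def moment_matrix(y, k):
    """Univariate moment matrix M_k[y] from the moment list y[0..2k]."""
    return np.array([[y[a + b] for b in range(k + 1)] for a in range(k + 1)])

# Moments of (delta_{+1} + delta_{-1}) / 2:
# y_a = (1^a + (-1)^a) / 2, i.e. 1 for even a, 0 for odd a.
y = [(1.0 + (-1.0) ** a) / 2.0 for a in range(7)]

r1 = np.linalg.matrix_rank(moment_matrix(y, 1))
r2 = np.linalg.matrix_rank(moment_matrix(y, 2))
print(r1, r2)   # equal ranks: the flatness condition holds
```

The stabilized rank equals the number of atoms of the representing measure, which is what makes extraction of the optimizers possible.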

3. Finite Convergence and Optimality Conditions

While asymptotic convergence is guaranteed under archimedeanness, finite convergence of the Moment–SOS hierarchy requires classical nonlinear-optimization conditions at every global minimizer of the underlying problem:

  • Linear Independence Constraint Qualification (LICQ): Gradients of equality constraints are linearly independent.
  • Strict Complementarity Condition (SCC): Lagrange multipliers for active constraints are strictly positive.
  • Second Order Sufficiency Condition (SOSC): Lagrangian Hessian is positive definite on the critical subspace.

Under these assumptions, finite convergence is achieved: there exists $k_0$ such that for all $k \geq k_0$,

$$v_k = \theta_k = v^* = \theta^*,$$

and all optimal moment sequences at level $k$ satisfy the flat extension rank criterion [(Huang et al., 2023), Theorem 3.2]. This result holds even when the constraint ideal is not real radical. Illustrative cases demonstrating finite convergence with non-real radical ideals include nonconvex sphere constraints and degeneracies where only SCC/SOSC are satisfied [(Huang et al., 2023), Examples 3.4 and 3.5].

4. Homogenization and Denominator Hierarchies for Unbounded Domains

For noncompact $K$, the hierarchy may not exhibit finite convergence. The method introduces a homogenization variable $x_0 \geq 0$, replacing $p(x)$ with its homogenization $p^\wedge(x_0, x)$ and imposing $x_0^2 + \|x\|^2 = 1$ as an additional constraint. The feasible set becomes compact, and an equivalent moment–SOS hierarchy applies, with the same finite convergence guarantees under optimality conditions on the homogenized problem [(Huang et al., 2023), Theorem 4.3]. This resolves key open conjectures in the literature by removing the need for a real radical constraint ideal (Huang et al., 2023).
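On the coefficient level, homogenization simply pads each monomial with the power of $x_0$ needed to reach the total degree of $p$. A minimal sketch (polynomials as exponent-tuple-to-coefficient dictionaries; the representation and naming are illustrative):

```python
def homogenize(p):
    """Homogenize p (dict: exponent tuple -> coeff) with a new variable x0.

    Returns p^(x0, x): each monomial x^alpha of degree |alpha| is padded
    by x0^(deg p - |alpha|); keys gain one leading slot for x0's exponent.
    """
    d = max(sum(a) for a in p)                     # total degree of p
    return {(d - sum(a),) + a: c for a, c in p.items()}

# p(x1, x2) = x1^3 - 2*x1*x2 + 5  (total degree 3)
p = {(3, 0): 1.0, (1, 1): -2.0, (0, 0): 5.0}
ph = homogenize(p)
print(ph)   # {(0, 3, 0): 1.0, (1, 1, 1): -2.0, (3, 0, 0): 5.0}
```

Every monomial of the result has total degree 3, so $p^\wedge$ is homogeneous and $p^\wedge(1, x) = p(x)$.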

In addition, the denominator-SOS hierarchy uses certificates of the form

$$(1 + \|x\|^2)^{\ell}\,(f - \gamma) \in \mathrm{Ideal}[c_j]_{2k} + \mathrm{QM}[c_j]_{2k}$$

for some exponent $\ell$, with convergence (and finite convergence under the classical optimality conditions) similarly assured [(Huang et al., 2023), Theorem 4.7].

5. Generalizations: Polynomial Matrix and Sparse Hierarchies

The framework readily extends to:

  • Polynomial Matrix Inequality (PMI) Optimization: Optimization over $x \in \mathbb{R}^n$ subject to $G(x) \succeq 0$ for a symmetric polynomial matrix $G(x)$. The corresponding matrix moment–SOS hierarchy replaces scalar moment and localizing matrices by block versions and uses quadratic modules of the form $\mathrm{QM}[G]$; finite convergence is guaranteed under archimedeanness and NDC/SCC/SOSC at all minimizers (Huang et al., 2024, Guo et al., 2023).
  • Sparse and Block-Decomposed Hierarchies: When the problem admits correlative or term sparsity, as formalized by the running intersection property and term-sparsity patterns, relaxations assemble from block- or clique-structured moment matrices. Finite convergence and “tightness” are characterized by decomposability of the SOS certificate across blocks, i.e., the ability to write $f - f_{\min}$ as a sum of block-local elements of the ideal plus quadratic module (Nie et al., 2024, Wang et al., 2019, Wang et al., 2020).
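To make the correlative-sparsity idea concrete, the sketch below (brute force, illustrative, suitable only for small $n$) extracts the maximal cliques of the correlative sparsity graph, whose vertices are variables and whose edges join variables appearing in a common term. For a chain-structured polynomial the cliques overlap consecutively, satisfying the running intersection property:

```python
import itertools

def csp_cliques(terms, n):
    """Maximal cliques of the correlative-sparsity graph (brute force).

    terms: iterable of exponent tuples; variables i and j are adjacent
    when some term uses both.
    """
    adj = [[False] * n for _ in range(n)]
    for a in terms:
        support = [i for i in range(n) if a[i] > 0]
        for i, j in itertools.combinations(support, 2):
            adj[i][j] = adj[j][i] = True
    def is_clique(s):
        return all(adj[i][j] for i, j in itertools.combinations(s, 2))
    cands = [set(s) for r in range(1, n + 1)
             for s in itertools.combinations(range(n), r) if is_clique(s)]
    return [c for c in cands if not any(c < d for d in cands)]

# f = x1^2*x2 + x2*x3^2 + x3*x4: chain sparsity over 4 variables
terms = [(2, 1, 0, 0), (0, 1, 2, 0), (0, 0, 1, 1)]
cliques = csp_cliques(terms, 4)
print(cliques)   # [{0, 1}, {1, 2}, {2, 3}]
```

The sparse hierarchy then builds one (much smaller) moment matrix per clique instead of a single moment matrix over all $n$ variables.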

6. Applications and Examples

The Moment–SOS framework applies to a broad class of polynomial optimization and generalized moment problems:

Representative domains include robust and semi-infinite polynomial matrix optimization, stochastic program approximations, and optimal power flow (Guo et al., 2023, Josz et al., 2013).

7. Algorithmic and Computational Considerations

  • The core relaxation at order kk leads to semidefinite programs with moment and localizing matrices whose sizes grow with the number of variables nn and the relaxation order.
  • Block and sparse relaxations dramatically reduce computational cost for large, structured problems (Wang et al., 2019, Wang et al., 2020, Nie et al., 2024).
  • Flat extension detection enables algebraic extraction of finitely atomic measures corresponding to global minimizers (Huang et al., 2023, Guo et al., 2023).
  • Large-scale moment–SOS relaxations exploiting polyhedral–SDP structure can be solved efficiently via low-rank augmented Lagrangian methods (Hou et al., 2025).
  • For problems with degenerate constraint ideals (positive-dimensional varieties), branch-and-bound or filtering techniques combined with moment–SOS lower bounds provide practical approaches to approximate global optimization (Mohammadi et al., 2016, Ðurašinović et al., 2025).
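The growth of the SDP size with $n$ and $k$ can be read off from the dimension of the degree-$k$ monomial basis, $\binom{n+k}{k}$; a quick tabulation (standard combinatorial fact, shown here only to illustrate the scaling):

```python
from math import comb

# The order-k moment matrix M_k[y] is indexed by monomials of degree <= k
# in n variables, so its side length is C(n + k, k).
for n in (2, 5, 10, 20):
    sizes = [comb(n + k, k) for k in (1, 2, 3)]
    print(f"n = {n:2d}: moment matrix side length at k = 1, 2, 3 -> {sizes}")
```

Even at modest orders the matrices become large (e.g. over a thousand rows for $n = 20$, $k = 3$), which is exactly what motivates the sparse and low-rank methods above.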

The Moment–SOS relaxation paradigm offers a unified, algebraically grounded hierarchy for nonconvex polynomial and moment-constrained optimization, with deep connections to algebraic geometry, numerical analysis, and semidefinite programming. Recent work resolves major conjectures by establishing that finite convergence can be certified under optimality conditions without recourse to real radicality, yielding both theoretical completeness and robust applicability across diverse domains (Huang et al., 2023).
