Nash-Smooth Hyperbolicity Cone

Updated 28 September 2025
  • A Nash-smooth hyperbolicity cone is a convex cone defined by a hyperbolic polynomial whose boundary is locally described by Nash functions and has strict positive curvature.
  • It admits a lifted linear matrix inequality representation with blocks of size at most $2 \times 2$, making it second-order cone representable and computationally efficient.
  • Its structure supports derivative relaxations and central paths, ensuring robust analytic properties and effective optimization in advanced conic frameworks.

A Nash-Smooth Hyperbolicity Cone is a convex cone arising from a hyperbolic polynomial and characterized by boundary regularity defined via Nash functions—analytic functions whose graphs are semialgebraic—combined with strict positive curvature at each boundary point. This structure synthesizes several advanced strands in convex algebraic geometry and optimization, including the theory of hyperbolic polynomials, spectrahedral and second-order cone representation, central paths and their generalizations, amenability of cones, and the role of boundary smoothness in efficient optimization paradigms.

1. Hyperbolic Polynomials and Hyperbolicity Cones

Hyperbolicity cones are defined via homogeneous real polynomials $f \in \mathbb{R}[x_1, \dots, x_n]$ that are hyperbolic with respect to a direction $e \in \mathbb{R}^n$, meaning that for every $u \in \mathbb{R}^n$, the univariate restriction $t \mapsto f(te - u)$ has only real zeros. The hyperbolicity cone is then

$$C_e(f) = \{\, u \in \mathbb{R}^n : f(te - u) \neq 0 \ \forall\, t < 0 \,\}.$$

These cones are closed, convex, and generalize the positive orthant and the positive semidefinite cone. Many optimization paradigms, such as hyperbolic programming and interior-point algorithms, operate naturally over such cones (Renegar, 2010).
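
The real-rootedness and root-sign conditions above can be checked numerically. The following is a minimal sketch (the helper names `restriction_coeffs` and `in_hyperbolicity_cone` are illustrative, not from the cited papers): it recovers the coefficients of $t \mapsto f(te - u)$ by polynomial interpolation and tests that all roots are real and nonnegative, using the Lorentz cone $f(x) = x_3^2 - x_1^2 - x_2^2$ as an example.

```python
import numpy as np

def restriction_coeffs(f, e, u, d):
    """Coefficients (highest degree first) of the degree-d polynomial
    t -> f(t*e - u), recovered by interpolation at d+1 sample points."""
    ts = np.arange(d + 1, dtype=float)
    vals = [f(t * e - u) for t in ts]
    return np.polyfit(ts, vals, d)

def in_hyperbolicity_cone(f, e, u, d, tol=1e-9):
    """u lies in C_e(f) iff every root of t -> f(t*e - u) is real
    and nonnegative (checked up to a numerical tolerance)."""
    roots = np.roots(restriction_coeffs(f, e, u, d))
    return bool(np.all(np.abs(roots.imag) < tol) and np.all(roots.real >= -tol))

# Lorentz cone: f(x) = x3^2 - x1^2 - x2^2 is hyperbolic w.r.t. e = (0, 0, 1).
f = lambda x: x[2]**2 - x[0]**2 - x[1]**2
e = np.array([0.0, 0.0, 1.0])

print(in_hyperbolicity_cone(f, e, np.array([1.0, 1.0, 2.0]), d=2))  # True:  2 >= sqrt(2)
print(in_hyperbolicity_cone(f, e, np.array([1.0, 1.0, 1.0]), d=2))  # False: 1 <  sqrt(2)
```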

2. Nash-Smoothness and Boundary Structure

A boundary point $v \in \partial K$ of a closed convex semialgebraic set $K \subset \mathbb{R}^n$ is Nash-smooth if there exists a Nash function $h$ defined in a neighborhood $U$ of $v$ such that $h$ is $C^\infty$, $\nabla h(v) \neq 0$, and $K \cap U = \{ x \in U : h(x) \geq 0 \}$. Strict positive curvature at $v$ is the requirement that for every nonzero $w$ orthogonal to $\nabla h(v)$,

$$w^T \operatorname{Hess} h(v)\, w < 0,$$

where $\operatorname{Hess} h(v)$ is the Hessian of $h$ at $v$. This condition is more general than polynomial (algebraic) smoothness, permitting semialgebraic and analytic boundaries.
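
As a concrete illustration of this definition (a numerical sketch with hypothetical helper names, using finite-difference derivatives rather than anything from the cited papers), the following restricts a numerical Hessian to the tangent space at a boundary point and checks negative definiteness, for the unit disk $K = \{x : 1 - x_1^2 - x_2^2 \geq 0\}$, whose defining polynomial is in particular a Nash function:

```python
import numpy as np

def grad(h, v, eps=1e-5):
    """Central-difference gradient of h at v."""
    n = len(v)
    g = np.zeros(n)
    for i in range(n):
        d = np.zeros(n); d[i] = eps
        g[i] = (h(v + d) - h(v - d)) / (2 * eps)
    return g

def hessian(h, v, eps=1e-4):
    """Central-difference Hessian of h at v."""
    n = len(v)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            di = np.zeros(n); di[i] = eps
            dj = np.zeros(n); dj[j] = eps
            H[i, j] = (h(v + di + dj) - h(v + di - dj)
                       - h(v - di + dj) + h(v - di - dj)) / (4 * eps**2)
    return H

def strictly_positively_curved(h, v, tol=1e-6):
    """Check w^T Hess h(v) w < 0 for all nonzero w orthogonal to grad h(v),
    by restricting the Hessian to the tangent space (null space of the gradient)."""
    g = grad(h, v)
    assert np.linalg.norm(g) > tol, "Nash-smoothness requires a nonzero gradient"
    # Orthonormal basis of {w : <g, w> = 0} from the SVD of g as a row vector.
    _, _, Vt = np.linalg.svd(g.reshape(1, -1))
    T = Vt[1:].T                      # columns span the tangent space at v
    restricted = T.T @ hessian(h, v) @ T
    return bool(np.max(np.linalg.eigvalsh(restricted)) < -tol)

# Unit disk {h >= 0} with h(x) = 1 - x1^2 - x2^2; boundary point v = (1, 0).
h = lambda x: 1.0 - x[0]**2 - x[1]**2
print(strictly_positively_curved(h, np.array([1.0, 0.0])))  # True: w^T H w = -2|w|^2
```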

3. Representation: Spectrahedral Shadows and Second-Order Cones

Netzer and Sanyal showed that every smooth hyperbolicity cone is a spectrahedral shadow, i.e., the linear projection of a spectrahedral cone (Netzer et al., 2012). Nash-smoothness is a weaker, more general regularity requirement. The principal result of (Scheiderer, 21 Sep 2025) shows that every Nash-smooth hyperbolicity cone admits a lifted linear matrix inequality (LMI) representation in which every block is at most $2 \times 2$, i.e.,

$$C_e(f) = \{\, x \in \mathbb{R}^n : \exists\, y \in \mathbb{R}^m \text{ with } \sum_{i=1}^n x_i A_i + \sum_{j=1}^m y_j B_j \succeq 0 \,\},$$

with all diagonal blocks of size $\leq 2$. This is equivalent to second-order cone representability (SOCR), which is stronger than being a spectrahedral shadow, since SOCPs are computationally preferable to general SDPs.

| Representational class | Block size | Applicability |
|---|---|---|
| Spectrahedral cone | Arbitrary | All smooth cones |
| Spectrahedral shadow (projection) | Arbitrary | Smooth / Nash-smooth |
| Second-order cone (SOCR) | $\leq 2 \times 2$ | Nash-smooth with strict positive curvature |
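
The reason $2 \times 2$ blocks give SOCR is the classical identity that a symmetric $2 \times 2$ matrix $\begin{pmatrix} a & b \\ b & c \end{pmatrix}$ is PSD iff $\|(2b,\, a-c)\| \leq a + c$. A quick numerical sanity check of this equivalence (illustrative only, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)

def psd_2x2(a, b, c):
    """2x2 PSD test via eigenvalues."""
    return np.min(np.linalg.eigvalsh(np.array([[a, b], [b, c]]))) >= -1e-12

def soc_form(a, b, c):
    """Equivalent second-order cone constraint: ||(2b, a-c)||_2 <= a + c."""
    return np.hypot(2 * b, a - c) <= a + c + 1e-12

# The two predicates agree on random samples, illustrating why LMIs with
# 2x2 blocks are exactly second-order cone representable constraints.
samples = rng.normal(size=(10000, 3))
agree = all(psd_2x2(a, b, c) == soc_form(a, b, c) for a, b, c in samples)
print("2x2 PSD block <=> SOC constraint on all samples:", agree)
```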

4. Derivative Relaxations, Central Swaths, and Cone Families

A central theme in (Renegar, 2010, Lourenço et al., 2021) is the construction of nested families of cones via derivative relaxations
$$\Lambda_+(p, e),\ \Lambda_+^{(1)},\ \Lambda_+^{(2)},\ \dots,\ \Lambda_+^{(d-1)},$$
with each $\Lambda_+^{(m)}$ cut out by the conditions $D_e^m p(x) \geq 0,\ D_e^{m+1} p(x) \geq 0, \dots$ These relaxations form a hierarchy
$$\Lambda_+ \subseteq \Lambda_+^{(1)} \subseteq \Lambda_+^{(2)} \subseteq \dots \subseteq \mathbb{R}^n,$$
and facilitate continuous paths (“central swaths”) to optimality in interior-point frameworks; a small numerical illustration of the nesting follows below. Strict Nash-smoothness ensures favorable analytic and geometric properties for each relaxation and its faces: every face of a Nash-smooth hyperbolicity cone is itself a hyperbolicity cone, or the intersection of one with a suitable derivative relaxation.
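
To make the nesting concrete, here is a numerical sketch (helper names are illustrative; it reuses the interpolation trick from the earlier snippet, since $D_e^m p(x)$ is $m!$ times the coefficient of $t^m$ in $t \mapsto p(x + te)$). For $p(x) = x_1 x_2 x_3$ with $e = (1,1,1)$, whose cone is the positive octant, the point $u = (-1, 2, 3)$ lies outside $\Lambda_+$ but inside the first derivative relaxation $\Lambda_+^{(1)}$:

```python
import math
import numpy as np

def poly_in_t(p, base, direction, d):
    """Coefficients (highest degree first) of the degree-d polynomial
    t -> p(base + t*direction), recovered by interpolation."""
    ts = np.arange(d + 1, dtype=float)
    return np.polyfit(ts, [p(base + t * direction) for t in ts], d)

def in_cone(p, e, u, d, tol=1e-9):
    """u is in the hyperbolicity cone of p iff all roots of
    t -> p(t*e - u) are real and nonnegative."""
    roots = np.roots(poly_in_t(p, -u, e, d))
    return bool(np.all(np.abs(roots.imag) < tol) and np.all(roots.real >= -tol))

def renegar_derivative(p, e, m, d):
    """q = D_e^m p, via the Taylor expansion of t -> p(x + t*e)."""
    def q(x):
        c = poly_in_t(p, x, e, d)          # highest degree first
        return math.factorial(m) * c[d - m]
    return q

# p(x) = x1*x2*x3 is hyperbolic w.r.t. e = (1,1,1); its cone is the octant.
p = lambda x: x[0] * x[1] * x[2]
e = np.ones(3)
u = np.array([-1.0, 2.0, 3.0])             # one negative coordinate

print(in_cone(p, e, u, d=3))                               # False: u outside the octant
print(in_cone(renegar_derivative(p, e, 1, 3), e, u, d=2))  # True: strictly larger relaxation
```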

5. Tensor Evaluation and Proof Techniques

The proof of SOCR for Nash-smooth cones in (Scheiderer, 21 Sep 2025) uses tensor evaluation. For a linear polynomial $f$ nonnegative on $K$, one constructs the tensor $f^{\otimes}(\xi) \in R \otimes R$; the criterion yields SOCR if, for every tangent vector $T_v$ at a boundary point $\xi$ (arising from the Nash function $h$), the sum-of-squares invariant is $\leq 2$. This is achieved by expressing the positive tangent as a sum of squares of diagonal tensors with positive coefficients in a local étale cover, using Nash function lifts and Kähler differentials.

6. Algorithmic and Optimization Implications

SOCR implies that conic optimization over Nash-smooth hyperbolicity cones can be implemented with SOCPs, greatly improving efficiency. Interior-point and Frank-Wolfe-type algorithms can be tailored to exploit this structure, given that both primal and dual projections, as well as minimum eigenvalue computations, become tractable or even admit closed-form formulas in certain cases (Nagano et al., 12 Jul 2024). The amenability established for hyperbolicity cones (Lourenço et al., 2021) ensures robust error bounds and smoothness, critical for stability in conic feasibility and Nash equilibrium problems.
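
As a minimal end-to-end illustration (assuming the open-source `cvxpy` modeling library; the specific problem is a toy example, not taken from the cited papers), here is a linear objective minimized over a compact slice of the 3-dimensional Lorentz cone, the simplest hyperbolicity cone with a $2 \times 2$-block LMI representation:

```python
import cvxpy as cp
import numpy as np

# Toy SOCP over a compact slice of the 3-d Lorentz cone
# {x : ||(x1, x2)||_2 <= x3}.
x = cp.Variable(3)
objective = cp.Minimize(np.array([1.0, -1.0, 0.0]) @ x)
constraints = [cp.SOC(x[2], x[:2]),  # ||(x1, x2)||_2 <= x3
               x[2] <= 1.0]          # cap the slice so the minimum is attained
prob = cp.Problem(objective, constraints)
prob.solve()
print(prob.value)   # approx -sqrt(2)
print(x.value)      # approx (-1/sqrt(2), 1/sqrt(2), 1)
```

The same feasible cone could equivalently be written as the single $2 \times 2$ LMI block $\begin{pmatrix} x_3 + x_1 & x_2 \\ x_2 & x_3 - x_1 \end{pmatrix} \succeq 0$, illustrating the representation theorem in miniature.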

7. Comparison, Extensions, and Applications

Earlier work on spectrahedrality or spectrahedral shadows for hyperbolicity cones (Brändén, 2012, Amini, 2016, Saunderson, 2017) applied to cones with stricter algebraic smoothness. The Nash-smooth result generalizes this framework to cones defined by Nash functions and strict positive curvature, broadening the applicability to real-analytic, semialgebraic boundaries. This has relevance in convex algebraic geometry, structural optimization, and game theory, where regularity and tractable representations (SOCP over cones with Nash-smooth boundaries) accelerate equilibrium computation and facilitate robust mathematical modeling.

Summary

A Nash-Smooth Hyperbolicity Cone is a convex hyperbolicity cone defined by a real homogeneous polynomial, whose boundary locally admits a Nash function with nonzero gradient and strict positive curvature. All such cones have a block-diagonal LMI representation with blocks of size at most $2 \times 2$, i.e., they are second-order cone representable (Scheiderer, 21 Sep 2025). The cone’s structure is stable under derivative relaxations, lends itself to robust optimization via SOCP, and ensures favorable analytic and geometric properties for all faces and nested relaxations. These results extend the reach of spectrahedral representations and enable efficient algorithms for optimization and equilibrium computation.
