
Primal-Dual Interior-Point Methods

Updated 17 April 2026
  • Primal-dual interior-point methods are algorithms for convex optimization that update primal and dual variables along a barrier-defined central path.
  • They leverage self-concordant barriers and Riemannian metrics to ensure robust global convergence and efficient short-step iterations.
  • Extensions include applications to hyperbolic cone programming, Gaussian quadrature for scaling, and links to quasi-Newton updates.

A primal-dual interior-point method (PDIPM) is a class of algorithms for solving convex optimization problems that simultaneously update both primal and dual variables by following a central path defined by barrier-augmented KKT conditions. These methods extend the original “primal IPM” framework to provide robust path-tracking, efficient global convergence, and analytic complexity guarantees across a broad spectrum of conic, hyperbolic, and nonlinear programming classes. Modern developments in this area have introduced generalizations to hyperbolic cone programming, refined the theory of primal-dual metrics, and connected algorithmic operations to Riemannian geometry, Gaussian quadrature, and quasi-Newton updates (Myklebust et al., 2014).

1. Mathematical Foundations: Primal-Dual Formulation and Self-Concordant Barriers

Primal-dual interior-point methods are defined for convex optimization problems in canonical conic form:

  • Primal: $\min\,\langle c,x\rangle$  s.t. $Ax=b$,  $x\in K$
  • Dual: $\max\,\langle b,y\rangle$  s.t. $A^*y+s=c$,  $s\in K^*$

Here $K\subset\mathbb{E}$ is a closed, pointed convex cone with nonempty interior, $A$ is a surjective linear map, and $(x,s)$ are the primal/dual variables. A $\theta$-logarithmically homogeneous self-concordant barrier (LHSCB) $F:\operatorname{int}K\to\mathbb{R}$ satisfies:

  • $|D^3F(x)[h,h,h]| \le 2\,\big(D^2F(x)[h,h]\big)^{3/2}$ for all $x\in\operatorname{int}K$ and $h\in\mathbb{E}$ (self-concordance),
  • $F(tx) = F(x) - \theta\ln t$ for all $x\in\operatorname{int}K$ and $t>0$ (logarithmic homogeneity).

The Hessian $F''(x)$ induces a Riemannian metric $\langle F''(x)u,v\rangle$ on $\operatorname{int}K$, and the conjugate barrier $F_*(s) := \sup_{x}\{-\langle s,x\rangle - F(x)\}$ defines an analogous metric on $\operatorname{int}K^*$ (Myklebust et al., 2014).
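As a concrete instance (not spelled out in this summary), the nonnegative orthant $K=\mathbb{R}^n_+$ carries the $\theta$-LHSCB $F(x) = -\sum_i \ln x_i$ with $\theta = n$ and conjugate $F_*(s) = -\sum_i \ln s_i - n$. A short numerical check of logarithmic homogeneity and of the conjugacy relation $-F_*'(-F'(x)) = x$:

```python
import math

# theta-LHSCB for the nonnegative orthant: F(x) = -sum(ln x_i), theta = n.
def F(x):
    return -sum(math.log(xi) for xi in x)

def grad_F(x):
    return [-1.0 / xi for xi in x]

# Conjugate barrier for the orthant: F_*(s) = -sum(ln s_i) - n.
def grad_F_star(s):
    return [-1.0 / si for si in s]

x = [0.5, 2.0, 3.0]   # arbitrary interior point (illustrative data)
theta = len(x)
t = 4.0

# Logarithmic homogeneity: F(t x) = F(x) - theta * ln t.
lhs = F([t * xi for xi in x])
rhs = F(x) - theta * math.log(t)

# Conjugacy: -F_*'(-F'(x)) recovers x itself.
shadow = [-g for g in grad_F_star([-g2 for g2 in grad_F(x)])]
```

Both identities hold exactly (up to floating-point rounding), which is what makes the primal and dual barriers interchangeable in the metric constructions below.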

2. Construction of Local Primal-Dual Metrics and Scaling Operators

A central feature of PDIPMs is the introduction of local “scaling” operators $T^2$: self-adjoint, positive-definite linear maps carrying the dual space to the primal space, which facilitate a symmetric treatment of the primal and dual iterates:

  • $T^2 s = x$,
  • $T^2 \tilde{s} = \tilde{x}$, where $\tilde{x} := -F_*'(s)$ and $\tilde{s} := -F'(x)$ denote the shadow iterates.

Families of admissible scaling operators $T^2$ are indexed by the tightness of operator inequalities relating $T^2$ to the Hessians $F''(x)$ and $F_*''(s)$. For example, the $\mathcal{T}_2(\eta)$ family consists of those $T^2$ satisfying the two scaling equations together with two-sided operator bounds parametrized by $\eta \ge 1$, where the best attainable tightness constant is characterized as the optimal value of an auxiliary SDP (Myklebust et al., 2014).
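For the nonnegative orthant (an illustrative special case with hypothetical data), the Nesterov–Todd-style scaling $T^2 = \operatorname{Diag}(x_i/s_i)$ satisfies both scaling equations $T^2 s = x$ and $T^2\tilde{s} = \tilde{x}$, since the shadow iterates are $\tilde{x} = 1/s$ and $\tilde{s} = 1/x$ componentwise:

```python
# Diagonal scaling operator for the orthant: T^2 = Diag(x_i / s_i).
# Verifies T^2 s = x and T^2 s_tilde = x_tilde, with shadow iterates
# x_tilde = 1/s and s_tilde = 1/x (componentwise, orthant barrier).
x = [0.5, 2.0, 3.0]
s = [4.0, 0.25, 1.0]

T2 = [xi / si for xi, si in zip(x, s)]               # diagonal of T^2
x_tilde = [1.0 / si for si in s]                     # -F_*'(s)
s_tilde = [1.0 / xi for xi in x]                     # -F'(x)

T2_s = [t * si for t, si in zip(T2, s)]              # should equal x
T2_s_tilde = [t * st for t, st in zip(T2, s_tilde)]  # should equal x_tilde
```

For non-symmetric cones no such closed-form diagonal choice exists in general, which is what motivates the families of admissible operators above.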

3. Short-Step Algorithmic Framework and Iteration Complexity

The short-step PDIPM alternates predictor and corrector steps within a neighborhood of the central path. For a chosen scaling $T^2$ and centering parameter $\gamma\in[0,1]$, the Newton system in the search directions $(d_x, d_y, d_s)$ is

$A\,d_x = 0,\qquad A^* d_y + d_s = 0,\qquad d_x + T^2 d_s = -x + \gamma\,\mu\,\tilde{x},$

where $\mu := \langle x,s\rangle/\theta$. This leads (after block-matrix elimination) to a normal-equations system solved for $d_y$, followed by a primal-dual update. The step size is chosen so that the next iterate remains in the cone interiors.

Key properties:

  • Predictor ($\gamma = 0$): reduces the duality gap $\langle x,s\rangle$ by a constant fraction every $O(\sqrt{\theta})$ iterations while keeping the proximity measure small.
  • Corrector ($\gamma = 1$): does not reduce $\langle x,s\rangle$ but decreases the proximity measure quadratically.

Iteration complexity matches the Nesterov–Todd bound for symmetric-cone PDIPMs: $O(\sqrt{\theta}\,\ln(1/\varepsilon))$ iterations to drive the duality gap below $\varepsilon$ (Myklebust et al., 2014).
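A minimal sketch of one scaled predictor step ($\gamma = 0$) for an LP over the orthant, using the diagonal scaling $T^2 = \operatorname{Diag}(x/s)$ and the system $A\,d_x = 0$, $A^\top d_y + d_s = 0$, $d_x + T^2 d_s = -x + \gamma\mu\tilde{x}$; the data and damping are arbitrary, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random strictly feasible LP data over the nonnegative orthant.
m, n = 2, 5
A = rng.standard_normal((m, n))
x = rng.uniform(0.5, 2.0, n)           # primal interior point
s = rng.uniform(0.5, 2.0, n)           # dual slack interior point

theta = n
mu = x @ s / theta
gamma = 0.0                            # predictor step

# Scaled Newton system with T^2 = Diag(x/s), x_tilde = 1/s:
#   A dx = 0,  A^T dy + ds = 0,  dx + T^2 ds = -x + gamma*mu*x_tilde.
T2 = x / s
r = -x + gamma * mu / s
# Eliminate ds = -A^T dy and dx = r + T2*(A^T dy); A dx = 0 gives:
M = A @ (T2[:, None] * A.T)            # normal-equations matrix A Diag(T^2) A^T
dy = np.linalg.solve(M, -A @ r)
ds = -A.T @ dy
dx = r - T2 * ds

# Damped step length keeping both iterates strictly interior.
alpha = 0.5 * min(1.0,
                  *(-x[i] / dx[i] for i in range(n) if dx[i] < 0),
                  *(-s[i] / ds[i] for i in range(n) if ds[i] < 0))
x_new, s_new = x + alpha * dx, s + alpha * ds
```

Because $d_x$ lies in the null space of $A$ and $d_s$ in the range of $A^\top$, the directions are orthogonal, and with $\gamma = 0$ the gap contracts exactly: $\langle x_{+}, s_{+}\rangle = (1-\alpha)\langle x, s\rangle$.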

4. Extensions: Hyperbolic Cone Programming, Integral Scaling, and Gaussian Quadrature

For hyperbolic cones (cones of hyperbolicity of hyperbolic polynomials $p$, generalizing the positive semidefinite cone), the primal barrier $F(x) = -\ln p(x)$ admits favorable Hessian estimation properties along segments. Notably, two integral scaling constructions generalize the classical Nesterov–Todd metric:

  • Dual integral scaling: $T_D^2 := \mu\int_0^1 F_*''\big(s + t\,(\mu\tilde{s} - s)\big)\,dt$, which satisfies $T_D^2 s = x$ and $T_D^2 \tilde{s} = \tilde{x}$.
  • Primal integral scaling: involves integrating $F''\big(x + t\,(\mu\tilde{x} - x)\big)$ over $t\in[0,1]$; for hyperbolic barriers the integrand is a rational function in $t$, so the integral can be evaluated via exact Gaussian quadrature or truncated rules, yielding a computable scaling with provable error guarantees.

Such constructions preserve short-step iteration complexity and exploit cone structure efficiently (Myklebust et al., 2014).
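For the orthant barrier (an illustrative case with hypothetical data, where $F_*''(s) = \operatorname{Diag}(1/s_i^2)$), the dual integral scaling $\mu\int_0^1 F_*''(s + t(\mu\tilde{s} - s))\,dt$ reduces to componentwise integrals of a rational function of $t$ that evaluate in closed form to $x_i/s_i$; a Gauss–Legendre rule recovers this to high accuracy:

```python
import numpy as np

x = np.array([0.5, 2.0, 3.0])          # illustrative interior iterates
s = np.array([4.0, 0.25, 1.0])
theta = len(x)
mu = x @ s / theta

s_tilde = 1.0 / x                      # -F'(x) for the orthant barrier
# Dual integral scaling: T^2 = mu * int_0^1 F_*''(s + t(mu*s_tilde - s)) dt,
# with F_*''(s) = Diag(1/s_i^2), so each diagonal entry is a rational
# integrand in t that Gauss-Legendre quadrature handles accurately.
nodes, weights = np.polynomial.legendre.leggauss(20)   # rule on [-1, 1]
t = 0.5 * (nodes + 1.0)                # map nodes to [0, 1]
w = 0.5 * weights

T2 = np.zeros(theta)
for ti, wi in zip(t, w):
    s_t = s + ti * (mu * s_tilde - s)
    T2 += wi * mu / s_t**2

# Closed form for this case: the integral equals x_i / s_i exactly,
# so the quadrature reproduces the Nesterov-Todd-style scaling.
```

The same quadrature idea applies to hyperbolic barriers, where the integrand's rational structure in $t$ makes fixed-order rules exact or tightly bounded.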

5. Connections to Riemannian Geometry, Operator Means, and Quasi-Newton Theory

The set of admissible primal-dual metrics $T^2$ is geodesically convex in the manifold of positive-definite operators, establishing connections to the Riemannian geometry of self-concordant barriers. The integral scalings can be interpreted as arithmetic means of Hessians (averages over operator spaces), complementary to the Nesterov–Todd geometric mean. Gaussian quadrature, leveraging the polynomial structure of the hyperbolic polynomial $p$, provides computationally efficient and accurate approximations to such operator integrals.

Further, when using midpoint or mean approximations, well-chosen quasi-Newton updates (of DFP or BFGS type) restore the primal-dual metric equations $T^2 s = x$ and $T^2\tilde{s} = \tilde{x}$, which aligns interior-point metric updates with classical variable-metric (quasi-Newton) optimization methods. Norm bounds show that such corrections remain controlled within the central-path neighborhood (Myklebust et al., 2014).
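The parallel with variable-metric methods is visible in the standard BFGS update formula: given any symmetric positive-definite $H$ and iterates with $\langle x,s\rangle > 0$, the update below produces $H_+$ with $H_+ s = x$, i.e. it enforces the first scaling equation as a secant condition (a generic sketch of the BFGS mechanism, not the paper's specific correction):

```python
import numpy as np

def bfgs_update(H, s, x):
    """BFGS-type update of a symmetric positive-definite matrix H so
    that the result satisfies the secant-type scaling equation
    H_plus @ s = x (requires x @ s > 0 to preserve definiteness)."""
    Hs = H @ s
    return H - np.outer(Hs, Hs) / (s @ Hs) + np.outer(x, x) / (x @ s)

rng = np.random.default_rng(1)
n = 4
B = rng.standard_normal((n, n))
H = B @ B.T + n * np.eye(n)            # symmetric positive definite start
x = rng.uniform(0.5, 2.0, n)           # illustrative primal iterate
s = rng.uniform(0.5, 2.0, n)           # illustrative dual iterate, x @ s > 0

H_plus = bfgs_update(H, s, x)
```

Applying a second such update with the shadow iterates in place of $(x, s)$ is the analogous mechanism for the second metric equation.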

6. Algorithmic Impact and Extensions Beyond Symmetric Cones

The developed primal-dual methodology:

  • Establishes broad families of short-step PDIPs with explicit local metrics for all self-concordant barriers,
  • Extends the $O(\sqrt{\theta}\,\ln(1/\varepsilon))$ iteration complexity beyond symmetric cones to general convex cones,
  • Exploits favorable structures of hyperbolic barriers and allows efficient metric computation via operator quadrature,
  • Provides geometric and algorithmic ties to other classes of first-order and variable-metric methods.

These advances substantially widen the scope of guaranteed-efficient interior-point technology in convex optimization, set a unified analytic foundation across metric choices, and enable performance gains in conic and hyperbolic programming settings with complicated barrier geometry (Myklebust et al., 2014).

References (1)

  1. Myklebust, T. G. J., & Tunçel, L. (2014). Interior-point algorithms for convex optimization based on primal-dual metrics. arXiv:1411.2129.
