
Condor Refine: High-Precision Conic Optimization

Updated 7 April 2026
  • Condor Refine is a solution-polishing method that improves conic program accuracy by achieving 10× to 100× residual norm reductions using a matrix-free, Newton-like approach.
  • It leverages the homogeneous primal–dual embedding and analytic derivatives of cone projections to ensure efficiency and robustness in large-scale optimization problems.
  • The method integrates an LSQR iterative least-squares solver with a line search strategy to deliver certificate-grade solutions at marginal additional computational cost.

Condor Refine is a lightweight, high-precision solution-polishing step for conic optimization, designed to substantially improve the accuracy of approximate solutions produced by standard first-order or operator-splitting methods. It operates at regular points of the homogeneous primal–dual embedding of conic programs, applying a matrix-free Newton-like update implemented with the LSQR iterative least-squares solver and analytic derivatives of projections onto the relevant cones. The method is efficient, robust, and compatible with large-scale or abstract-operator problem settings, consistently yielding 10× to 100× reductions in residual norm at marginal computational cost (Busseti et al., 2018).

1. Homogeneous Primal–Dual Embedding and Residual Formulation

Condor Refine is constructed for the solution of generic conic programs in primal-dual form:

  • Primal: $\min\; c^T x$ subject to $Ax + s = b$, $s \in \mathcal{K}$
  • Dual: $\min\; b^T y$ subject to $A^T y + c = 0$, $y \in \mathcal{K}^*$

The homogeneous self-dual embedding introduces auxiliary variables and combines primal and dual feasibility into a single feasibility problem: $Qu = v,\qquad u\in\tilde{\mathcal{K}},\; v\in\tilde{\mathcal{K}}^*,\; u_{m+n+1}+v_{m+n+1}>0$ where

$\tilde{\mathcal{K}} = \mathbb{R}^n \times \mathcal{K}^* \times \mathbb{R}_+, \quad \tilde{\mathcal{K}}^* = \{0\}^n \times \mathcal{K} \times \mathbb{R}_+$

and $Q\in\mathbb{R}^{(m+n+1)\times(m+n+1)}$ is the skew-symmetric matrix encoding the problem data $A$, $b$, $c$:

$Q = \begin{bmatrix} 0 & A^T & c \\ -A & 0 & b \\ -c^T & -b^T & 0 \end{bmatrix}$

The residual map is then

$\mathcal{R}(z) = Q\,\Pi(z) - \bigl(\Pi(z) - z\bigr) = \bigl((Q - I)\Pi + I\bigr)(z)$

where $\Pi$ is the Euclidean projection onto $\tilde{\mathcal{K}}$ and, by the Moreau decomposition, $\Pi(z) - z \in \tilde{\mathcal{K}}^*$. Any $z$ such that $\mathcal{R}(z) = 0$ and $z_{m+n+1} \neq 0$ corresponds exactly to a certificate or solution of the original conic program (Busseti et al., 2018).
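To make the residual concrete, here is a small self-contained check on a one-variable LP whose solution is known, assuming the standard skew-symmetric embedding matrix of the homogeneous self-dual form; all names are illustrative:

```python
import numpy as np

# Tiny LP: minimize x subject to -x + s = 0, s >= 0, so x* = 0, y* = 1.
A, b, c = np.array([[-1.0]]), np.array([0.0]), np.array([1.0])
m, n = A.shape

# Standard skew-symmetric embedding matrix Q encoding A, b, c.
Q = np.block([
    [np.zeros((n, n)), A.T,              c[:, None]],
    [-A,               np.zeros((m, m)), b[:, None]],
    [-c[None, :],      -b[None, :],      np.zeros((1, 1))],
])

def proj(z):
    # Projection onto K~ = R^n x K* x R_+, with K = K* = R_+^m here:
    # the x-block is free, the y-block and tau are clipped at zero.
    out = z.copy()
    out[n:] = np.maximum(out[n:], 0.0)
    return out

def residual(z):
    u = proj(z)
    return Q @ u - (u - z)  # R(z) = Q Pi(z) - (Pi(z) - z)

z_star = np.array([0.0, 1.0, 1.0])       # z encoding (x*, y*, tau = 1)
print(np.linalg.norm(residual(z_star)))  # → 0.0
```

At the optimum the residual vanishes exactly; at a perturbed point it does not, and this norm is precisely the quantity the refinement drives down.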

2. Differentiability and Jacobian Structure

At a “regular” point, where $\mathcal{R}$ is differentiable and its Jacobian is invertible, the method employs a Newton-like refinement:

$z^{+} = z - \mathsf{D}\mathcal{R}(z)^{-1}\,\mathcal{R}(z), \qquad \mathsf{D}\mathcal{R}(z) = (Q - I)\,\mathsf{D}\Pi(z) + I$

Here $\mathsf{D}\Pi(z)$, the derivative of the projection, is block-diagonal across the Cartesian product structure of $\tilde{\mathcal{K}}$, allowing per-cone analytic forms:

  • Nonnegative orthant: $\mathsf{D}\Pi(z) = \operatorname{diag}(\mathbf{1}\{z > 0\})$, a diagonal 0/1 matrix
  • Second-order cone: closed-form expressions branching on the vector-part norm $\|z_{2:n}\|$ versus the scalar part $z_1$ (identity inside the cone, zero inside the polar cone, and a structured matrix in the region between)
  • Semidefinite cone: uses the eigendecomposition $Z = U\Lambda U^T$ and structured Jacobian expressions in terms of $U$ and $\Lambda$
  • Exponential cone: involves solving a small KKT system for the local projection Jacobian (Busseti et al., 2018)
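As one example of these per-cone derivatives, here is a sketch of the second-order cone projection and its Jacobian in the nontrivial middle region, checked against central finite differences (`proj_soc` and `dproj_soc` are illustrative names, not the reference implementation's API):

```python
import numpy as np

def proj_soc(z):
    # Euclidean projection onto the second-order cone {(t, x): ||x|| <= t}.
    t, x = z[0], z[1:]
    r = np.linalg.norm(x)
    if r <= t:
        return z.copy()              # already inside the cone
    if r <= -t:
        return np.zeros_like(z)      # inside the polar cone
    a = (t + r) / 2.0                # project onto the boundary
    return np.concatenate(([a], a * x / r))

def dproj_soc(z):
    # Jacobian of the SOC projection (well defined away from r = |t|).
    t, x = z[0], z[1:]
    r = np.linalg.norm(x)
    d = z.size
    if r <= t:
        return np.eye(d)
    if r <= -t:
        return np.zeros((d, d))
    xh = x / r
    J = np.empty((d, d))
    J[0, 0] = 0.5
    J[0, 1:] = J[1:, 0] = 0.5 * xh
    J[1:, 1:] = ((t + r) / (2 * r)) * np.eye(d - 1) \
                - (t / (2 * r)) * np.outer(xh, xh)
    return J

# Finite-difference check at a point in the middle region.
z = np.array([0.3, 1.0, -2.0])
J = dproj_soc(z)
h = 1e-6
J_fd = np.column_stack([
    (proj_soc(z + h * e) - proj_soc(z - h * e)) / (2 * h)
    for e in np.eye(3)
])
print(np.max(np.abs(J - J_fd)))  # near zero: analytic and numeric agree
```

The Jacobian is symmetric, as it must be for the derivative of a Euclidean projection, which is what makes the matrix-free adjoint in the next section cheap.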

This modular structure permits efficient, local application of cone-derivative routines without constructing or storing large Jacobian matrices.

3. Matrix-Free Refinement via LSQR

Given an approximate $z$ with $\mathcal{R}(z) \neq 0$, one solves the regularized (Levenberg–Marquardt) normal equations:

$\bigl(\mathsf{D}\mathcal{R}(z)^T \mathsf{D}\mathcal{R}(z) + \lambda I\bigr)\,\delta = -\mathsf{D}\mathcal{R}(z)^T\,\mathcal{R}(z)$

which is equivalent to the damped least-squares problem

$\min_{\delta}\;\bigl\|\mathsf{D}\mathcal{R}(z)\,\delta + \mathcal{R}(z)\bigr\|_2^2 + \lambda\,\|\delta\|_2^2$

Condor Refine leverages the LSQR algorithm, a matrix-free iterative least-squares solver. Only the action of $\mathsf{D}\mathcal{R}(z)$ and $\mathsf{D}\mathcal{R}(z)^T$ on vectors (matvecs) is required; a matvec is applied as follows:

  • Apply $\mathsf{D}\Pi(z)$ block by block, per cone, to the input vector
  • Multiply by $Q - I$ via operator calls to $A$ and $A^T$ as needed
  • Add back the input vector (the identity term)

A line search on the step size $\alpha$ in the update $z \leftarrow z + \alpha\delta$ is employed to guarantee monotonic reduction in the residual norm. The procedure is optionally iterated a small number of times (Busseti et al., 2018).
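The damping equivalence this relies on can be checked directly with SciPy's LSQR (a generic illustration on random data, not the reference code):

```python
import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 8))
b = rng.standard_normal(20)
lam = 1e-2

# LSQR with damp = sqrt(lam) minimizes ||Ax - b||^2 + lam * ||x||^2 ...
x_lsqr = lsqr(A, b, damp=np.sqrt(lam), atol=1e-12, btol=1e-12)[0]

# ... which is exactly the solution of the regularized normal equations.
x_ne = np.linalg.solve(A.T @ A + lam * np.eye(8), A.T @ b)
print(np.allclose(x_lsqr, x_ne))  # → True
```

Because LSQR handles the damping internally, the refinement never needs to form $\mathsf{D}\mathcal{R}(z)^T\mathsf{D}\mathcal{R}(z)$ explicitly.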

Condor Refine Pseudocode

In outline: project the current iterate and form the residual $\mathcal{R}(z)$; solve the damped least-squares problem with LSQR using matrix-free operators; backtrack on the step size until the residual norm decreases; repeat for a fixed number of refinement passes.
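A minimal matrix-free sketch of this loop in Python, assuming only the pieces described above; the names `refine`, `proj`, and `dproj` and the default parameters are illustrative, not the reference implementation's API:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr

def refine(z, proj, dproj, Q, lam=1e-8, passes=2, max_ls=10):
    """Polish z by damped Gauss-Newton steps on R(z) = (Q - I)Pi(z) + z.

    proj(z)  -> projection Pi(z) onto the embedding cone
    dproj(z) -> callable applying the projection Jacobian DPi(z) to a vector
    Q        -> skew-symmetric embedding matrix (or abstract operator)
    """
    n = z.size
    for _ in range(passes):
        u = proj(z)
        res = Q @ u - (u - z)                       # R(z)
        DPi = dproj(z)
        # Matrix-free Jacobian: DR(z) w = (Q - I) DPi(z) w + w.
        mv = lambda w: Q @ DPi(w) - DPi(w) + w
        # Adjoint (DPi is symmetric): DR(z)^T w = DPi((Q^T - I) w) + w.
        rmv = lambda w: DPi(Q.T @ w - w) + w
        DR = LinearOperator((n, n), matvec=mv, rmatvec=rmv)
        delta = lsqr(DR, -res, damp=np.sqrt(lam))[0]
        # Backtracking line search: accept the first step reducing ||R||.
        alpha, base = 1.0, np.linalg.norm(res)
        for _ in range(max_ls):
            z_trial = z + alpha * delta
            u_trial = proj(z_trial)
            if np.linalg.norm(Q @ u_trial - (u_trial - z_trial)) < base:
                z = z_trial
                break
            alpha /= 2.0
    return z
```

Each LSQR matvec touches only $A$, $A^T$ (inside $Q$) and the per-cone derivative routines, so no Jacobian matrix is ever formed or stored.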

4. Computational Efficiency and Scaling

Each LSQR iteration requires one application of $\mathsf{D}\mathcal{R}(z)$ and one of $\mathsf{D}\mathcal{R}(z)^T$, corresponding to one call each to $A$, $A^T$, the projection $\Pi$, and its derivative $\mathsf{D}\Pi$. The default LSQR iteration budget is modest (tens of iterations), and typically two refinement cycles are run.

Comparative metrics:

  • Condor Refine: 5–20 ms per moderate-sized, sparse conic instance
  • Full solve: 100 ms to 1 s for the original optimization using first-order or operator-splitting solvers
  • Interior-point methods: incur dense- or sparse-matrix factorizations whose per-iteration cost grows cubically with problem dimension in the dense case (and with fill-in for sparse factors)

Thus, Condor Refine achieves substantial accuracy improvement at a small computational fraction of the original solution effort (Busseti et al., 2018).

5. Implementation Design and Reference Code

The open-source implementation (Python, see https://github.com/cvxgrp/cone_prog_refine) employs:

  • SciPy CSC sparse matrices for $A$, NumPy arrays for $b$ and $c$
  • A Python dict structure to describe the cones making up $\mathcal{K}$
  • Numba JIT-compiled projection and projection-derivative routines for efficiency
  • Built-in SciPy LSQR with custom matvec/adjoint operators and a tight default tolerance
  • A small Levenberg–Marquardt stabilization parameter $\lambda$
  • Up to 10 line-search steps, two refinement passes

This approach ensures the method and its extensions are compatible with both explicit and abstract linear operators (Busseti et al., 2018).

6. Empirical Results and Operational Characteristics

Numerical experiments on instances with mixed linear, second-order, semidefinite, and exponential cones demonstrate:

  • Residual norms improve by $10\times$ to $100\times$ in the majority of cases
  • Refinement succeeds within a small number of passes, adding negligible runtime relative to the cost of obtaining the original approximate solution

The regularity assumption (Jacobian invertibility) is typically satisfied in practical instances; nonregular points are rare in contemporary applications.

Condor Refine is positioned as a robust, systematic polishing layer that can be appended to most first-order or embedded conic solution routines. It delivers substantial value in obtaining certificate-grade solutions in high-precision or reliability-critical deployments (Busseti et al., 2018).

References

Busseti, E., Moursi, W. M., and Boyd, S. (2018). Solution refinement at regular points of conic problems.
