Condor Refine: High-Precision Conic Optimization
- Condor Refine is a solution-polishing method that improves conic program accuracy by achieving 10× to 100× residual norm reductions using a matrix-free, Newton-like approach.
- It leverages the homogeneous primal–dual embedding and analytic derivatives of cone projections to ensure efficiency and robustness in large-scale optimization problems.
- The method integrates an LSQR iterative least-squares solver with a line search strategy to deliver certificate-grade solutions at marginal additional computational cost.
Condor Refine is a lightweight, high-precision solution-polishing step for conic optimization problems, designed to substantially improve the accuracy of approximate solutions produced by standard first-order or operator-splitting methods. Condor Refine operates at regular points of the homogeneous primal–dual embedding of conic programs, using a matrix-free Newton-like approach implemented with the LSQR iterative least-squares solver and analytic derivatives of projections onto the relevant cones. The method is efficient, robust, and compatible with large-scale or abstract-operator problem settings, consistently yielding 10× to 100× reductions in residual norm at marginal computational cost (Busseti et al., 2018).
1. Homogeneous Primal–Dual Embedding and Residual Formulation
Condor Refine is constructed for the solution of generic conic programs in primal-dual form:
- Primal: minimize $c^T x$ subject to $Ax + s = b$, $s \in \mathcal{K}$
- Dual: maximize $-b^T y$ subject to $A^T y + c = 0$, $y \in \mathcal{K}^*$

where $x \in \mathbb{R}^n$, $s, y \in \mathbb{R}^m$, and $\mathcal{K} \subseteq \mathbb{R}^m$ is a closed convex cone with dual cone $\mathcal{K}^*$.
The homogeneous self-dual embedding introduces auxiliary variables $u, v \in \mathbb{R}^{m+n+1}$ and combines primal and dual feasibility into a single feasibility problem: $Qu - v = 0,\qquad u\in\tilde{\mathcal{K}},\; v\in\tilde{\mathcal{K}}^*,\; u_{m+n+1}+v_{m+n+1}>0$ where
$\tilde{\mathcal{K}} = \mathbb{R}^n \times \mathcal{K}^* \times \mathbb{R}_+, \quad \tilde{\mathcal{K}}^* = \{0\}^n \times \mathcal{K} \times \mathbb{R}_+$
and $Q \in \mathbb{R}^{(m+n+1)\times(m+n+1)}$ is the skew-symmetric matrix encoding the problem data $A$, $b$, $c$:
$Q = \begin{bmatrix} 0 & A^T & c \\ -A & 0 & b \\ -c^T & -b^T & 0 \end{bmatrix}$
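As a concrete sketch of assembling the skew-symmetric matrix $Q = \begin{bmatrix} 0 & A^T & c \\ -A & 0 & b \\ -c^T & -b^T & 0 \end{bmatrix}$ from problem data with SciPy (illustrative only; `build_Q` is a hypothetical helper name, not part of the reference implementation):

```python
import numpy as np
import scipy.sparse as sp

def build_Q(A, b, c):
    """Assemble the skew-symmetric embedding matrix
        [ 0    A^T  c ]
        [-A    0    b ]
        [-c^T -b^T  0 ]
    from problem data A (m x n), b (m,), c (n,). Illustrative helper."""
    A = sp.csc_matrix(A)
    m, n = A.shape
    b_col = sp.csc_matrix(np.asarray(b, dtype=float).reshape(m, 1))
    c_col = sp.csc_matrix(np.asarray(c, dtype=float).reshape(n, 1))
    return sp.bmat([[None,     A.T,      c_col],
                    [-A,       None,     b_col],
                    [-c_col.T, -b_col.T, None]], format="csc")

# Tiny example: Q is (m+n+1) x (m+n+1) and satisfies Q + Q^T == 0
Q = build_Q(np.array([[-1.0]]), np.array([-1.0]), np.array([1.0]))
```

Keeping $Q$ in CSC form means matvecs with it reduce to sparse multiplications by $A$ and $A^T$ plus inner products with $b$ and $c$.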
The residual map is then
$R(z) = Q\,\Pi z - (\Pi z - z) = \big((Q - I)\Pi + I\big)\,z$
where $\Pi$ is the Euclidean projection onto $\tilde{\mathcal{K}}$ and, by the Moreau decomposition, $v = \Pi z - z \in \tilde{\mathcal{K}}^*$. Any $z$ such that $R(z) = 0$ and $z_{m+n+1} \neq 0$ corresponds exactly to a certificate or solution of the original conic program: $z_{m+n+1} > 0$ yields an optimal primal–dual pair, while $z_{m+n+1} < 0$ yields an infeasibility certificate (Busseti et al., 2018).
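To make the residual map concrete, here is a minimal sketch for a one-variable LP, minimize $x$ subject to $x \ge 1$ (so $\mathcal{K} = \mathbb{R}_+$), whose known optimum $x^\star = 1$, $y^\star = 1$, $\tau = 1$ gives a zero residual; all names are illustrative:

```python
import numpy as np

# Tiny LP: minimize x subject to x >= 1, written as Ax + s = b, s in R_+.
A = np.array([[-1.0]]); b = np.array([-1.0]); c = np.array([1.0])
n, m = 1, 1
Q = np.array([[ 0.0, -1.0,  1.0],    # [ 0   A^T  c ]
              [ 1.0,  0.0, -1.0],    # [-A   0    b ]
              [-1.0,  1.0,  0.0]])   # [-c^T -b^T 0 ]

def proj(z):
    # Euclidean projection onto K~ = R^n x K* x R_+ (free block, then nonnegative parts)
    u = z.copy()
    u[n:] = np.maximum(z[n:], 0.0)
    return u

def residual(z):
    u = proj(z)
    v = u - z              # Moreau decomposition: v lies in K~*
    return Q @ u - v       # R(z) = ((Q - I)Pi + I) z

# At the optimum, z = (x, y, tau) = (1, 1, 1) and the residual vanishes:
z_star = np.array([1.0, 1.0, 1.0])
print(np.linalg.norm(residual(z_star)))   # 0.0: z_star encodes an exact solution
```

A perturbed point, e.g. `z_star + 0.1`-scale noise, produces a nonzero residual, which is exactly what the refinement step drives back toward zero.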
2. Differentiability and Jacobian Structure
At a “regular” point—where $R$ is differentiable and its Jacobian is invertible—the method employs a Newton-like refinement:
$z^{+} = z - DR(z)^{-1} R(z), \qquad DR(z) = (Q - I)\,D\Pi(z) + I$
Here $D\Pi(z)$, the derivative of the projection, is block-diagonal across the Cartesian product structure of $\tilde{\mathcal{K}}$, allowing per-cone analytic forms:
- Nonnegative orthant: $D\Pi(z) = \mathrm{diag}\!\left(\mathbf{1}\{z_i > 0\}\right)$, a 0–1 diagonal matrix
- Second-order cone: closed-form expressions that branch on how the norm of the vector part compares with the scalar part, including full derivatives for points on or near the boundary
- Semidefinite cone: uses the eigendecomposition $Z = U \Lambda U^T$ and structured Jacobian expressions in terms of $U$ and the eigenvalues $\Lambda$
- Exponential cone: involves solving a small KKT linear system, derived from the optimality conditions of the projection subproblem, for the local projection Jacobian (Busseti et al., 2018)
This modular structure permits efficient, local application of cone-derivative routines without constructing or storing large Jacobian matrices.
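As one instance of these per-cone routines, the following sketch implements the standard closed-form projection onto the second-order cone and its Jacobian, checked against finite differences (function names are illustrative, not taken from the reference implementation):

```python
import numpy as np

def proj_soc(z):
    """Euclidean projection onto the second-order cone {(t, x) : ||x||_2 <= t}."""
    t, x = z[0], z[1:]
    nx = np.linalg.norm(x)
    if nx <= t:                      # inside the cone: projection is the identity
        return z.copy()
    if nx <= -t:                     # inside the polar cone: project to the origin
        return np.zeros_like(z)
    a = 0.5 * (t + nx)               # boundary case: scale onto the cone surface
    return np.concatenate(([a], (a / nx) * x))

def dproj_soc(z):
    """Analytic Jacobian of proj_soc away from the nondifferentiable set."""
    t, x = z[0], z[1:]
    nx = np.linalg.norm(x)
    k = x.size
    if nx <= t:
        return np.eye(k + 1)
    if nx <= -t:
        return np.zeros((k + 1, k + 1))
    u = x / nx
    J = np.empty((k + 1, k + 1))     # symmetric, as expected for a projection
    J[0, 0] = 0.5
    J[0, 1:] = 0.5 * u
    J[1:, 0] = 0.5 * u
    J[1:, 1:] = 0.5 * np.eye(k) + (t / (2 * nx)) * (np.eye(k) - np.outer(u, u))
    return J
```

Because each block derivative is available in this closed form, applying $D\Pi(z)$ to a vector costs about as much as one projection, with no matrix ever formed for the product cone as a whole.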
3. Matrix-Free Refinement via LSQR
Given an approximate $z$ with $R(z) \neq 0$, one solves the regularized normal equations
$\left( DR(z)^T DR(z) + \lambda I \right)\Delta z = -\,DR(z)^T R(z)$
which is equivalent to the damped least-squares problem
$\min_{\Delta z} \;\; \| DR(z)\,\Delta z + R(z) \|_2^2 + \lambda \|\Delta z\|_2^2$
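As a sanity check on this equivalence, the same step can be computed both ways; `J` and `r` below are random stand-ins for $DR(z)$ and $R(z)$, and the damping parameter passed to LSQR is $\sqrt{\lambda}$ because LSQR penalizes `damp**2 * ||x||^2`:

```python
import numpy as np
from scipy.sparse.linalg import lsqr

# Random stand-ins (illustrative) for the Jacobian DR(z) and residual R(z)
rng = np.random.default_rng(0)
J = rng.standard_normal((8, 5))
r = rng.standard_normal(8)
lam = 1e-6

# (1) Dense solve of the regularized normal equations
dz_dense = np.linalg.solve(J.T @ J + lam * np.eye(5), -J.T @ r)

# (2) LSQR on the damped least-squares problem with damp = sqrt(lambda)
dz_lsqr = lsqr(J, -r, damp=np.sqrt(lam), atol=1e-12, btol=1e-12)[0]
```

The two steps agree to solver tolerance; the LSQR route is the one that scales, since it never forms $DR(z)^T DR(z)$.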
Condor Refine leverages the LSQR algorithm—a matrix-free, iterative least-squares solver. Only the actions of $DR(z)$ and $DR(z)^T$ as matvecs are required, applied as follows:
- Compute $D\Pi(z)\,w$ per cone block for a vector $w$
- Multiply by $Q - I$ via operator calls to $A$ and $A^T$ as needed
- Add the identity term
A line search on the step length $\alpha$ is employed to guarantee monotonic reduction in residual norm. The procedure is optionally iterated a small number of times (Busseti et al., 2018).
Condor Refine Pseudocode
- Input: approximate solution $z$, number of refinement passes (default 2), regularization $\lambda$
- For each pass:
  1. Compute the residual $r = R(z)$ and set up matrix-free operators for $DR(z)$ and $DR(z)^T$
  2. Solve $\min_{\Delta z} \|DR(z)\,\Delta z + r\|_2^2 + \lambda\|\Delta z\|_2^2$ with LSQR
  3. Backtracking line search: halve $\alpha$ starting from 1 (up to 10 trials) until $\|R(z + \alpha\,\Delta z)\| < \|r\|$
  4. Update $z \leftarrow z + \alpha\,\Delta z$ (or end the pass if no step is accepted)
- Return the refined $z$
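Putting the pieces together, a minimal end-to-end sketch for a nonnegative-orthant problem (illustrative names and a toy LP; the reference implementation handles general cone products and abstract operators):

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr

# Toy LP: minimize x subject to x >= 1, i.e. Ax + s = b with s in R_+.
n, m = 1, 1
N = n + m + 1
Q = np.array([[ 0.0, -1.0,  1.0],    # skew-symmetric embedding of (A, b, c)
              [ 1.0,  0.0, -1.0],
              [-1.0,  1.0,  0.0]])
I = np.eye(N)

def proj(z):                          # projection onto R^n x R_+^m x R_+
    u = z.copy(); u[n:] = np.maximum(z[n:], 0.0); return u

def dproj_diag(z):                    # diagonal of the projection derivative
    d = np.ones(N); d[n:] = (z[n:] > 0).astype(float); return d

def residual(z):                      # R(z) = ((Q - I)Pi + I) z
    u = proj(z)
    return Q @ u - (u - z)

def refine(z, n_passes=2, lam=1e-8):
    for _ in range(n_passes):
        r = residual(z)
        d = dproj_diag(z)
        # DR(z) = (Q - I) diag(d) + I, applied matrix-free via matvec/rmatvec
        DR = LinearOperator((N, N),
                            matvec=lambda w: (Q - I) @ (d * w) + w,
                            rmatvec=lambda w: d * ((Q - I).T @ w) + w)
        dz = lsqr(DR, -r, damp=np.sqrt(lam), atol=1e-12, btol=1e-12)[0]
        alpha = 1.0                   # backtracking line search on ||R||
        for _ in range(10):
            if np.linalg.norm(residual(z + alpha * dz)) < np.linalg.norm(r):
                z = z + alpha * dz
                break
            alpha *= 0.5
    return z

z0 = np.array([1.1, 0.9, 1.05])      # crude approximate solution
z1 = refine(z0.copy())               # residual norm drops by orders of magnitude
```

The same skeleton extends to mixed cone products by making `proj` and `dproj_diag` dispatch block-wise, exactly the modularity described in Section 2.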
4. Computational Efficiency and Scaling
Each LSQR iteration requires one application of $DR(z)$ and one of $DR(z)^T$, corresponding to one call each to the matvecs with $A$ and $A^T$, the projection $\Pi$, and its derivative $D\Pi$. Default settings cap the number of LSQR iterations at a modest fixed value and typically run two refinement cycles, so the total number of operator applications remains small.
Comparative metrics:
- Condor Refine: 5–20 ms per moderate-sized, sparse conic instance
- Full solve: 100 ms to 1 s for the original optimization using first-order or operator-splitting solvers
- Interior-point methods: incur dense- or sparse-matrix factorizations, scaling as $O((m+n)^3)$ per iteration in the dense case, or with the fill-in of the sparse factorization
Thus, Condor Refine achieves substantial accuracy improvement at a small computational fraction of the original solution effort (Busseti et al., 2018).
5. Implementation Design and Reference Code
The open-source implementation (Python, see https://github.com/cvxgrp/cone_prog_refine) employs:
- SciPy CSC sparse matrices for $A$, NumPy arrays for $b$, $c$
- A Python dict structure to describe the product-cone structure of $\mathcal{K}$
- Numba JIT-compiled projection and projection-derivative routines for efficiency
- Built-in SciPy LSQR with custom matvec/adjoints and default tolerances
- A small Levenberg–Marquardt stabilization parameter $\lambda$
- Up to 10 line-search steps and two refinement passes by default
This approach ensures the method and its extensions are compatible with both explicit and abstract linear operators (Busseti et al., 2018).
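For illustration of the matvec/adjoint pattern, SciPy's `LinearOperator` can wrap the two operator actions and be passed directly to `lsqr`; the dense matrix `M` below is an illustrative stand-in for the abstract Jacobian $DR(z)$:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr

# M stands in (illustratively) for DR(z); in the matrix-free setting only the
# two closures below are supplied, never the matrix itself.
rng = np.random.default_rng(1)
M = rng.standard_normal((6, 6))

DR = LinearOperator((6, 6),
                    matvec=lambda w: M @ w,     # action of DR(z)
                    rmatvec=lambda w: M.T @ w)  # action of DR(z)^T

r = rng.standard_normal(6)
lam = 1e-12                                     # Levenberg-Marquardt stabilization
dz = lsqr(DR, -r, damp=np.sqrt(lam), atol=1e-12, btol=1e-12)[0]
```

Any object exposing `matvec`/`rmatvec`—an explicit sparse matrix, a JIT-compiled kernel, or an abstract operator—slots into the same call, which is what keeps the method compatible with both explicit and operator-only problem descriptions.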
6. Empirical Results and Operational Characteristics
Numerical experiments on instances with mixed linear, second-order, semidefinite, and exponential cones demonstrate:
- Residual norms are improved by factors of 10× to 100× in the majority of cases
- Success with limited passes of refinement and negligible increase in runtime relative to the cost of acquiring the original approximate answer
The regularity assumption (Jacobian invertibility) is typically satisfied; nonregular points are rarely encountered in practical instances.
Condor Refine is positioned as a robust, systematic polishing layer that can be appended to most first-order or embedded conic solution routines. It delivers substantial value in obtaining certificate-grade solutions in high-precision or reliability-critical deployments (Busseti et al., 2018).