
On the Relation Between LP Sharpness and Limiting Error Ratio and Complexity Implications for Restarted PDHG (2312.13773v3)

Published 21 Dec 2023 in math.OC

Abstract: There has been a recent surge in development of first-order methods (FOMs) for solving huge-scale linear programming (LP) problems. The attractiveness of FOMs for LP stems in part from the fact that they avoid costly matrix factorization computation. However, the efficiency of FOMs is significantly influenced - both in theory and in practice - by certain instance-specific LP condition measures. Xiong and Freund recently showed that the performance of the restarted primal-dual hybrid gradient method (PDHG) is predominantly determined by two specific condition measures: LP sharpness and Limiting Error Ratio. In this paper we examine the relationship between these two measures, particularly in the case when the optimal solution is unique (which is generic - at least in theory), and we present an upper bound on the Limiting Error Ratio involving the reciprocal of the LP sharpness. This shows that in LP instances where there is a dual nondegenerate optimal solution, the computational complexity of restarted PDHG can be characterized solely in terms of LP sharpness and the distance to optimal solutions, and simplifies the theoretical complexity upper bound of restarted PDHG for these instances.
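The restarted PDHG method discussed in the abstract can be illustrated on a standard-form LP, min cᵀx s.t. Ax = b, x ≥ 0, via its saddle-point formulation min_{x≥0} max_y cᵀx + yᵀ(b − Ax). The sketch below is a simplified illustration, not the paper's method: it uses fixed-frequency restarts to the running (ergodic) average, whereas the paper analyzes an adaptive restart scheme, and the step sizes, iteration counts, and the tiny test instance are all choices made here for demonstration.

```python
import numpy as np

def restarted_pdhg(c, A, b, iters=5000, restart_every=100):
    """PDHG with fixed-frequency restarts for  min c^T x  s.t.  Ax = b, x >= 0."""
    m, n = A.shape
    # Step sizes chosen so that tau * sigma * ||A||^2 < 1 (standard PDHG condition).
    tau = sigma = 0.9 / np.linalg.norm(A, 2)
    x, y = np.zeros(n), np.zeros(m)
    for _ in range(iters // restart_every):
        x_sum, y_sum = np.zeros(n), np.zeros(m)
        for _ in range(restart_every):
            # Projected primal gradient step (projection onto the nonnegative orthant).
            x_new = np.maximum(0.0, x - tau * (c - A.T @ y))
            # Dual ascent step with primal extrapolation.
            y = y + sigma * (b - A @ (2.0 * x_new - x))
            x = x_new
            x_sum += x
            y_sum += y
        # Restart the next epoch from the running average of this epoch.
        x, y = x_sum / restart_every, y_sum / restart_every
    return x, y

# Tiny instance:  min x1 + 2*x2  s.t.  x1 + x2 = 1,  x >= 0  (optimum x = (1, 0), value 1).
c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
x, y = restarted_pdhg(c, A, b)
```

Note that each iteration involves only matrix-vector products with A and Aᵀ, which is the source of the matrix-factorization-free appeal of FOMs for huge-scale LP mentioned above; how fast the restarts pay off is exactly what condition measures like LP sharpness and the Limiting Error Ratio govern.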

References (12)
  1. A. Belloni and R. M. Freund. On the symmetry function of a convex set. Mathematical Programming, 111(1-2):57–93, 2008.
  2. D. Bertsimas and J. N. Tsitsiklis. Introduction to Linear Optimization. Athena Scientific Belmont, MA, 1997.
  3. S. P. Boyd and L. Vandenberghe. Convex Optimization. Cambridge University Press, 2004.
  4. M. Epelman and R. M. Freund. A new condition measure, preconditioners, and relations between different measures of conditioning for conic linear systems. SIAM Journal on Optimization, 12(3):627–655, 2002.
  5. R. M. Freund. On the primal-dual geometry of level sets in linear and conic optimization. SIAM Journal on Optimization, 13(4):1004–1013, 2003.
  6. A. J. Hoffman. On approximate solutions of systems of linear inequalities. In Selected Papers Of Alan J Hoffman: With Commentary, pages 174–176. World Scientific, 2003.
  7. J. Renegar. Some perturbation theory for linear programming. Mathematical Programming, 65(1-3):73–91, 1994.
  8. G. W. Stewart. On scaled projections and pseudoinverses. Linear Algebra and its Applications, 112:189–193, 1989.
  9. M. J. Todd. A Dantzig-Wolfe-like variant of Karmarkar’s interior-point linear programming algorithm. Operations Research, 38(6):1006–1018, 1990.
  10. M. J. Todd and Y. Ye. A centered projective algorithm for linear programming. Mathematics of Operations Research, 15(3):508–529, 1990.
  11. S. A. Vavasis and Y. Ye. A primal-dual interior point method whose running time depends only on the constraint matrix. Mathematical Programming, 74(1):79–120, 1996.
  12. Z. Xiong and R. M. Freund. Computational guarantees for restarted PDHG for LP based on "limiting error ratios" and LP sharpness. arXiv preprint arXiv:2312.14774, 2023.