Computational Guarantees for Restarted PDHG for LP based on "Limiting Error Ratios" and LP Sharpness (2312.14774v4)
Abstract: In recent years, there has been growing interest in solving linear optimization problems - or more simply "LP" - using first-order methods to avoid the costly matrix factorizations that traditional methods require on huge-scale LP instances. The restarted primal-dual hybrid gradient method (PDHG) - together with some heuristic techniques - has emerged as a powerful tool for solving huge-scale LPs. However, the theoretical understanding of restarted PDHG and the validation of various heuristic implementation techniques are still very limited. Existing complexity analyses rely on the Hoffman constant of the LP KKT system, which is known to be overly conservative, is difficult to compute, and offers no insight into instance-specific characteristics of LP problems. These shortcomings have limited our ability to discern which characteristics of an LP instance make it easy or difficult. With the goal of overcoming these limitations, we introduce and develop two purely geometry-based condition measures for LP instances: the "limiting error ratio" and LP sharpness. We provide new computational guarantees for restarted PDHG based on these two condition measures. For the limiting error ratio, we provide a computable upper bound and show its relationship to the data instance's proximity to infeasibility under perturbation. For LP sharpness, we prove its equivalence to the stability of the LP optimal solution set under perturbation of the objective function. Our computational guarantees are validated on constructed instances. Conversely, our computational guarantees validate the practical efficacy of certain heuristic techniques (row preconditioners and step-size tuning) that improve computational performance. Finally, we present computational experiments on LP relaxations from the MIPLIB dataset that demonstrate the promise of various implementation strategies.
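The abstract invokes LP sharpness without stating it. As a reading aid, the following is the standard sharpness condition in the sense of Polyak, specialized to LP; the paper's own definition may carry a different normalization (for instance by the norm of the objective vector), so take this as a sketch rather than the paper's exact quantity:

```latex
% F = feasible region, X* = optimal solution set, f* = optimal value.
% Sharpness: the objective grows at least linearly in the distance to X*.
\mu \;=\; \inf_{x \in F \setminus X^\star}
  \frac{c^\top x - f^\star}{\operatorname{dist}(x, X^\star)} \;>\; 0.
```

Likewise, the abstract names restarted PDHG but not its update rule. Below is a minimal NumPy sketch of PDHG with average-iterate restarts for a standard-form LP, min c'x subject to Ax = b, x >= 0. The step-size rule tau = sigma = 1/||A||_2, the fixed restart factor, and the simple `kkt_residual` progress metric are illustrative assumptions, not the paper's adaptive scheme or the PDLP implementation:

```python
import numpy as np

def kkt_residual(A, b, c, x, y):
    # A simple progress metric: primal infeasibility, dual infeasibility
    # (negative part of the reduced costs c - A'y), and the duality gap.
    primal = np.linalg.norm(A @ x - b)
    dual = np.linalg.norm(np.minimum(c - A.T @ y, 0.0))
    gap = abs(c @ x - b @ y)
    return primal + dual + gap

def restarted_pdhg(A, b, c, iters=10000, restart_factor=0.5):
    """Sketch of restarted PDHG for:  min c'x  s.t.  Ax = b, x >= 0."""
    m, n = A.shape
    eta = 1.0 / np.linalg.norm(A, 2)  # tau = sigma = 1/||A||_2, so tau*sigma*||A||^2 <= 1
    x, y = np.zeros(n), np.zeros(m)
    x_bar, y_bar, t = x.copy(), y.copy(), 0
    last = kkt_residual(A, b, c, x, y)
    for _ in range(iters):
        # Projected primal descent step, then dual ascent with extrapolation.
        x_new = np.maximum(x - eta * (c - A.T @ y), 0.0)
        y = y + eta * (b - A @ (2 * x_new - x))
        x = x_new
        # Maintain running averages of the iterates within the current epoch.
        t += 1
        x_bar += (x - x_bar) / t
        y_bar += (y - y_bar) / t
        # Restart from the averages once the progress metric has decayed
        # by a fixed factor since the last restart.
        res = kkt_residual(A, b, c, x_bar, y_bar)
        if res <= restart_factor * last:
            x, y = x_bar.copy(), y_bar.copy()
            x_bar, y_bar, t = x.copy(), y.copy(), 0
            last = res
    return x_bar, y_bar
```

On a small feasible instance (e.g., a random A with b = A @ x0 for some x0 >= 0), the returned pair (x_bar, y_bar) should drive the residual toward zero; the linear convergence rate that the paper bounds via the limiting error ratio and LP sharpness manifests as a roughly constant number of inner iterations per restart.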