Inexact Augmented Lagrangian Methods for Conic Programs: Quadratic Growth and Linear Convergence (2410.22683v1)

Published 30 Oct 2024 in math.OC, cs.SY, and eess.SY

Abstract: Augmented Lagrangian Methods (ALMs) are widely employed for solving constrained optimization problems, and several efficient solvers have been developed based on this framework. Under the quadratic growth assumption, it is known that the dual iterates and the Karush-Kuhn-Tucker (KKT) residuals of ALMs applied to semidefinite programs (SDPs) converge linearly. In contrast, the convergence rate of the primal iterates has remained elusive. In this paper, we resolve this challenge by establishing new $\textit{quadratic growth}$ and $\textit{error bound}$ properties for primal and dual SDPs under the strict complementarity condition. Our main results reveal that both the primal and dual iterates of ALMs converge linearly, contingent solely upon strict complementarity and a bounded solution set. This finding provides a positive answer to an open question regarding the asymptotically linear convergence of the primal iterates of ALMs applied to semidefinite optimization.
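To make the abstract's setting concrete, the following is a minimal sketch of the generic augmented Lagrangian iteration (exact inner solves, fixed penalty) on a toy equality-constrained problem whose subproblems have a closed form. It is not the paper's inexact ALM for conic programs, and the problem instance, penalty parameter, and iteration count are illustrative choices; it only shows the primal-subproblem / dual-update structure and the geometric decay of the KKT residual that quadratic-growth conditions guarantee.

```python
import numpy as np

# Toy instance (assumed for illustration): minimize 0.5*||x||^2 subject to A x = b.
# This is a strongly convex QP, not an SDP; it keeps the ALM subproblem closed-form.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))
b = rng.standard_normal(3)

sigma = 10.0            # penalty parameter (fixed here; solvers often increase it)
y = np.zeros(3)         # dual iterate (multiplier estimate)
residuals = []
for _ in range(30):
    # Inner subproblem: argmin_x 0.5*||x||^2 + y@(A@x - b) + 0.5*sigma*||A@x - b||^2,
    # whose optimality condition is (I + sigma*A^T A) x = A^T (sigma*b - y).
    x = np.linalg.solve(np.eye(5) + sigma * A.T @ A, A.T @ (sigma * b - y))
    r = A @ x - b                 # primal feasibility (KKT) residual
    y = y + sigma * r             # dual ascent step of the ALM
    residuals.append(np.linalg.norm(r))

# With exact subproblem solves, this is the proximal point method on the dual,
# so the residual norms contract geometrically (linear convergence on a log scale).
print(residuals[0], residuals[-1])
```

The paper's contribution concerns the harder SDP case, where the inner subproblems are solved inexactly and linear convergence of the primal iterates requires the quadratic growth and error bound properties established under strict complementarity.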

