Formalization of Complexity Analysis of the First-order Algorithms for Convex Optimization (2403.11437v3)

Published 18 Mar 2024 in math.OC, cs.NA, and math.NA

Abstract: The convergence rate of various first-order optimization algorithms is a pivotal concern within the numerical optimization community, as it directly reflects the efficiency of these algorithms across different optimization problems. Our goal is to make a significant step forward in the formal mathematical representation of optimization techniques using the Lean 4 theorem prover. We first formalize the gradient for smooth functions and the subgradient for convex functions on a Hilbert space, laying the groundwork for an accurate formalization of algorithmic structures. We then extend our contribution by proving several properties of differentiable convex functions that have not yet been formalized in Mathlib. Finally, a comprehensive formalization of these algorithms is presented. These developments are not only noteworthy on their own but also serve as essential precursors to the formalization of a broader spectrum of numerical algorithms and their applications in machine learning as well as many other areas.
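To give a flavor of the kind of development the abstract describes, the following is a minimal Lean 4 sketch (over Mathlib) of a subgradient on a real inner product space and of fixed-step gradient-descent iterates. The identifiers `IsSubgradientAt`, `Subdifferential`, and `gdIter` are illustrative assumptions, not the paper's actual definitions.

```lean
import Mathlib.Analysis.InnerProductSpace.Basic

open scoped RealInnerProductSpace

variable {H : Type*} [NormedAddCommGroup H] [InnerProductSpace ℝ H]

/-- `g` is a subgradient of `f` at `x` if the affine minorant through `x`
    with slope `g` lies below `f` everywhere (hypothetical name). -/
def IsSubgradientAt (f : H → ℝ) (g x : H) : Prop :=
  ∀ y, f x + ⟪g, y - x⟫_ℝ ≤ f y

/-- The subdifferential of `f` at `x`: the set of all subgradients. -/
def Subdifferential (f : H → ℝ) (x : H) : Set H :=
  {g | IsSubgradientAt f g x}

/-- Gradient-descent iterates `x_{n+1} = x_n - t • f'(x_n)` with a fixed
    step size `t`, given a gradient map `f'` (hypothetical name). -/
def gdIter (f' : H → H) (t : ℝ) (x₀ : H) : ℕ → H
  | 0 => x₀
  | n + 1 => gdIter f' t x₀ n - t • f' (gdIter f' t x₀ n)
```

A convergence statement in this style would then bound `f (gdIter f' t x₀ n) - f xstar` in terms of `n`, the step size, and a smoothness constant, mirroring the classical O(1/k) analysis for smooth convex objectives.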

