
Nonlinear preconditioned primal-dual method for a class of structured minimax problems (2401.05143v1)

Published 10 Jan 2024 in math.OC, cs.NA, and math.NA

Abstract: We propose and analyze a general framework, called nonlinear preconditioned primal-dual with projection, for solving nonconvex-nonconcave and non-smooth saddle-point problems. The framework consists of two steps: a nonlinear preconditioned map, followed by a relaxed projection onto a separating hyperspace that we construct. One key to the method is the selection of the preconditioned operators, which are tailored to the structure of the saddle-point problem and are allowed to be nonlinear and asymmetric. The other is the construction of the separating hyperspace, which guarantees fast convergence. This framework paves the way for constructing nonlinear preconditioned primal-dual algorithms. We show weak convergence in general, sublinear convergence under the assumption that the saddle-point problem is convex, and linear convergence under a metric subregularity condition. We also show that many existing primal-dual methods, such as the generalized primal-dual algorithm, are special cases of relaxed preconditioned primal-dual with projection.
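
To make the two-step structure concrete, here is a minimal numerical sketch in the spirit of the framework, instantiated on a convex toy problem with a standard PDHG-type (linear, symmetric) preconditioner and a projection-correction step onto a separating half-space, in the style of Giselsson's nonlinear forward-backward splitting with projection correction. The toy saddle-point problem, the step sizes tau and sigma, and the relaxation parameter theta are illustrative assumptions, not the paper's exact scheme or notation.

```python
import numpy as np

# Toy convex-concave saddle-point problem (illustrative assumption):
#   min_x max_y  0.5*||x - a||^2 + <K x, y> - 0.5*||y||^2
# Optimality: x* = a - K^T y*, y* = K x*  =>  (I + K^T K) x* = a.
rng = np.random.default_rng(0)
n, m = 20, 15
K = rng.standard_normal((m, n))
a = rng.standard_normal(n)
x_star = np.linalg.solve(np.eye(n) + K.T @ K, a)
y_star = K @ x_star

# Step sizes chosen so tau * sigma * ||K||^2 < 1, making the PDHG
# preconditioner M = [[I/tau, -K^T], [-K, I/sigma]] positive semidefinite.
L = np.linalg.norm(K, 2)
tau = sigma = 0.9 / L
theta = 1.5                      # relaxed projection parameter in (0, 2)

x, y = np.zeros(n), np.zeros(m)
for _ in range(5000):
    # Step 1: preconditioned map. One PDHG iteration solves
    #   M(z - z_hat) in A(z_hat)
    # for the saddle operator A(x, y) = (x - a + K^T y, y - K x).
    x_hat = (x - tau * (K.T @ y) + tau * a) / (1.0 + tau)          # prox of 0.5||. - a||^2
    y_hat = (y + sigma * (K @ (2.0 * x_hat - x))) / (1.0 + sigma)  # prox of 0.5||.||^2

    # Step 2: relaxed projection onto the separating half-space
    #   H = { w : <d, w - z_hat> <= 0 },  d = M(z - z_hat),
    # which contains every saddle point by monotonicity of A.
    dx = (x - x_hat) / tau - K.T @ (y - y_hat)
    dy = (y - y_hat) / sigma - K @ (x - x_hat)
    num = dx @ (x - x_hat) + dy @ (y - y_hat)   # <d, z - z_hat> >= 0 since M is psd
    den = dx @ dx + dy @ dy
    if den < 1e-24:                             # z = z_hat: a fixed point was reached
        break
    x -= theta * (num / den) * dx
    y -= theta * (num / den) * dy

print("distance to saddle point:", np.hypot(np.linalg.norm(x - x_star),
                                            np.linalg.norm(y - y_star)))
```

The step length num/den is the exact Euclidean projection of the current iterate onto H, and keeping theta in (0, 2) makes the iterates Fejér monotone with respect to the solution set, the standard mechanism behind weak convergence for projection-correction schemes of this type. The framework in the paper replaces the linear PDHG preconditioner above with nonlinear, asymmetric preconditioned operators while keeping the same two-step template.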

