Primal-dual algorithm for weakly convex functions under sharpness conditions (2410.20977v1)

Published 28 Oct 2024 in math.OC, cs.NA, and math.NA

Abstract: We investigate the convergence of the primal-dual algorithm for composite optimization problems whose objective functions are weakly convex. We introduce a modified duality gap function that is a lower bound of the standard duality gap function. Under a sharpness condition on this new function, we identify a region around the set of saddle points in which the primal-dual algorithm converges. We present numerical examples and applications to image denoising and deblurring that demonstrate our results.
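
For context: in the standard saddle-point formulation min_x max_y L(x, y) = <Kx, y> + g(x) - f*(y) of the composite problem min_x f(Kx) + g(x), the usual duality gap at z = (x, y) is G(z) = sup_{y'} L(x, y') - inf_{x'} L(x', y), and a sharpness condition asks that the gap grow at least linearly with the distance to the saddle set. The paper's modified gap function is a lower bound of G adapted to the weakly convex case; its exact definition is given in the paper. Below is a minimal sketch of the first-order primal-dual (PDHG/Chambolle-Pock) iteration the abstract refers to, instantiated for 1-D total-variation denoising in the standard convex setting. The function name, operator choice, and step sizes are illustrative assumptions, not taken from the paper.

import numpy as np

def pdhg_tv_denoise(b, lam=0.1, n_iter=300):
    # Solve min_x lam*||Dx||_1 + 0.5*||x - b||^2 with PDHG, where D is the
    # 1-D forward-difference operator: f = lam*||.||_1 composed with K = D,
    # and g = 0.5*||. - b||^2 in the composite model min_x f(Kx) + g(x).
    n = b.size
    tau = sigma = 0.49          # tau*sigma*||D||^2 < 1, since ||D||^2 <= 4
    x = b.copy()
    x_bar = x.copy()
    y = np.zeros(n - 1)         # dual variable, one entry per edge
    for _ in range(n_iter):
        # Dual step: prox of (lam*||.||_1)* is projection onto [-lam, lam].
        y = np.clip(y + sigma * np.diff(x_bar), -lam, lam)
        # Primal step: apply D^T (negative divergence) to y, then prox of tau*g.
        DTy = np.zeros(n)
        DTy[:-1] -= y
        DTy[1:] += y
        x_new = (x - tau * DTy + tau * b) / (1.0 + tau)
        # Extrapolation with theta = 1.
        x_bar = 2.0 * x_new - x
        x = x_new
    return x

# Usage: recover a piecewise-constant signal from noisy samples.
rng = np.random.default_rng(0)
clean = np.repeat([0.0, 1.0, 0.5], 50)
denoised = pdhg_tv_denoise(clean + 0.1 * rng.standard_normal(clean.size), lam=0.2)

With theta = 1 and step sizes satisfying tau*sigma*||K||^2 < 1, this is the classical convergent regime for convex f and g; the paper's contribution is to identify a region around the saddle set in which the same iteration still converges when the objective functions are only weakly convex and the modified gap function is sharp.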

