Primal Dual Alternating Proximal Gradient Algorithms for Nonsmooth Nonconvex Minimax Problems with Coupled Linear Constraints (2212.04672v4)
Abstract: Nonconvex minimax problems have attracted wide attention in machine learning, signal processing, and many other fields in recent years. In this paper, we propose a primal-dual alternating proximal gradient (PDAPG) algorithm for solving nonsmooth nonconvex-(strongly) concave minimax problems with coupled linear constraints. The iteration complexity of the algorithm is proved to be $\mathcal{O}\left( \varepsilon^{-2} \right)$ (resp. $\mathcal{O}\left( \varepsilon^{-4} \right)$) under the nonconvex-strongly concave (resp. nonconvex-concave) setting to reach an $\varepsilon$-stationary point. To our knowledge, it is the first algorithm with an iteration complexity guarantee for solving nonconvex minimax problems with coupled linear constraints.
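For concreteness, a problem of the class described above can be written in the following generic form (an illustrative formulation only; the precise assumptions on $f$, $g$, $h$, $A$, $B$, and $c$ are those stated in the paper):
$$
\min_{x \in \mathcal{X}} \ \max_{\substack{y \in \mathcal{Y} \\ Ax + By \le c}} \ f(x, y) + g(x) - h(y),
$$
where $f$ is smooth, possibly nonconvex in $x$ and (strongly) concave in $y$; $g$ and $h$ are proper closed convex, possibly nonsmooth, terms handled via proximal steps; and $Ax + By \le c$ is the coupled linear constraint that links the minimization and maximization variables, which is what distinguishes this setting from minimax problems with separable constraint sets.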