Preconditioned Primal-Dual Gradient Methods for Nonconvex Composite and Finite-Sum Optimization (2309.13416v2)

Published 23 Sep 2023 in math.OC

Abstract: In this paper, we first introduce a preconditioned primal-dual gradient algorithm based on conjugate duality theory. This algorithm is designed to solve composite optimization problems whose objective function consists of two summands: a continuously differentiable nonconvex function and the composition of a nonsmooth nonconvex function with a linear operator. In contrast to existing nonconvex primal-dual algorithms, our proposed algorithm, through the use of conjugate duality, does not require the calculation of proximal mappings of nonconvex functions. Under mild conditions, we prove that any cluster point of the generated sequence is a critical point of the composite optimization problem. Under the Kurdyka-\L{}ojasiewicz property, we establish global convergence and convergence rates for the iterates. Second, for nonconvex finite-sum optimization, we propose a stochastic algorithm that combines the preconditioned primal-dual gradient algorithm with a class of variance-reduced stochastic gradient estimators. Almost sure global convergence and expected convergence rates are derived from the Kurdyka-\L{}ojasiewicz inequality. Finally, preliminary numerical results are presented to demonstrate the effectiveness of the proposed algorithms.
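To make the problem class concrete, here is a hedged sketch of a conjugate-duality reformulation together with a generic preconditioned primal-dual gradient step. All notation below ($f$, $h$, $K$, the preconditioner $M$, step sizes $\tau$, $\sigma$) is assumed for illustration and is not taken from the paper; for a nonconvex $h$, the saddle-point form shown corresponds to replacing $h$ with its biconjugate, and the paper's duality treatment may differ.

% Illustrative sketch only (amsmath assumed); f, h, K, M, tau, sigma are
% hypothetical notation, not the paper's. The dual variable is updated by a
% gradient step on the conjugate h^*, so no proximal mapping of the
% nonconvex term is evaluated.
\begin{align*}
  &\text{Problem class:} && \min_{x}\; F(x) = f(x) + h(Kx),\\
  &\text{Saddle-point form:} && \min_{x}\,\max_{y}\; f(x) + \langle Kx,\, y\rangle - h^{*}(y),\\
  &\text{Primal step:} && x^{k+1} = x^{k} - \tau\, M^{-1}\big(\nabla f(x^{k}) + K^{\top} y^{k}\big),\\
  &\text{Dual gradient step:} && y^{k+1} = y^{k} + \sigma\big(K x^{k+1} - \nabla h^{*}(y^{k})\big).
\end{align*}

The dual gradient step is what avoids any proximal mapping of the nonconvex term, matching the abstract's claim; whether the paper also preconditions the dual update, or orders the updates differently, is not determined by the abstract.

For the finite-sum variant, the abstract mentions a class of variance-reduced stochastic gradient estimators. As one representative of such a class (an assumption, not the paper's specific choice), a SARAH-type recursive estimator could replace $\nabla f$ in the primal step:

% One representative variance-reduced estimator (SARAH-type); the paper
% covers a class of such estimators, of which this is only an example.
\begin{align*}
  &\text{Finite-sum smooth part:} && f(x) = \tfrac{1}{n}\sum_{i=1}^{n} f_{i}(x),\\
  &\text{Recursive estimator:} && v^{k} = \nabla f_{i_k}(x^{k}) - \nabla f_{i_k}(x^{k-1}) + v^{k-1},
\end{align*}

with $v^{0} = \nabla f(x^{0})$ and $i_k$ sampled uniformly from $\{1,\dots,n\}$; the estimator $v^{k}$ then stands in for $\nabla f(x^{k})$ in the primal update above.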
