Exponential Convergence of Augmented Primal-dual Gradient Algorithms for Partially Strongly Convex Functions (2410.02192v3)
Abstract: We show that augmented primal-dual gradient algorithms can achieve global exponential convergence for partially strongly convex functions. In particular, the objective function only needs to be strongly convex on the subspace satisfying the equality constraint and can be generally convex elsewhere, provided its gradient is globally Lipschitz. Under these conditions, states outside the equality subspace converge to it exponentially fast. The analysis is then applied to distributed optimization, where partial strong convexity can be relaxed to a restricted secant inequality condition, which does not require convexity. This work unifies global exponential convergence results for several existing centralized and distributed algorithms.
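For orientation, the sketch below shows one common form of the augmented primal-dual gradient iteration the abstract refers to: gradient descent on the augmented Lagrangian L_rho(x, lam) = f(x) + lam^T (Ax - b) + (rho/2)||Ax - b||^2 in the primal variable, and gradient ascent in the dual variable. This is a minimal illustration, not the paper's exact scheme: the quadratic objective (taken globally strongly convex here for simplicity, whereas the paper only requires strong convexity on the constraint subspace), the problem data, the step size eta, and the augmentation weight rho are all illustrative assumptions.

```python
import numpy as np

# Toy problem: minimize f(x) = 0.5 x^T Q x + c^T x  subject to  A x = b.
# All problem data below are illustrative, not from the paper.
rng = np.random.default_rng(0)
n, m = 6, 3
A = rng.standard_normal((m, n)) / np.sqrt(n)   # scaled so ||A|| is modest
b = rng.standard_normal(m)
Q = np.diag([2.0, 1.5, 1.0, 0.5, 0.3, 0.2])    # positive definite for simplicity
c = rng.standard_normal(n)

def grad_f(x):
    # Gradient of the quadratic objective f(x) = 0.5 x^T Q x + c^T x.
    return Q @ x + c

rho, eta = 1.0, 0.02   # augmentation weight and step size (tuning assumed)
x = np.zeros(n)        # primal iterate
lam = np.zeros(m)      # dual iterate (Lagrange multiplier estimate)

for _ in range(20000):
    r = A @ x - b                                        # constraint residual
    # Primal descent step on the augmented Lagrangian
    # L_rho(x, lam) = f(x) + lam^T (A x - b) + (rho/2) ||A x - b||^2.
    x = x - eta * (grad_f(x) + A.T @ lam + rho * (A.T @ r))
    # Dual ascent step on the same augmented Lagrangian.
    lam = lam + eta * (A @ x - b)

print("constraint residual ||Ax - b|| =", np.linalg.norm(A @ x - b))
```

With a small enough step size, the residual ||Ax - b|| decays geometrically in this strongly convex case, matching the exponential convergence regime the paper analyzes under weaker (partial strong convexity or restricted secant inequality) assumptions.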