A Proximal Alternating Direction Method of Multiplier for Linearly Constrained Nonconvex Minimization (1812.10229v4)

Published 26 Dec 2018 in math.OC

Abstract: Consider the minimization of a nonconvex differentiable function over a polyhedron. A popular primal-dual first-order method for this problem performs a gradient projection iteration on the augmented Lagrangian function and then updates the dual multiplier vector using the constraint residual. However, numerical examples show that this approach can exhibit "oscillation" and may fail to converge. In this paper, we propose a proximal alternating direction method of multipliers for the multi-block version of this problem. A distinctive feature of this method is the introduction of a "smoothed" (i.e., exponentially weighted) sequence of primal iterates and the addition, at each iteration, of a quadratic proximal term to the augmented Lagrangian function, centered at the current smoothed primal iterate. The resulting proximal augmented Lagrangian function is inexactly minimized (via a gradient projection step) at each iteration, while the dual multiplier vector is updated using the residual of the linear constraints. When the primal and dual stepsizes are chosen sufficiently small, we show that suitable "smoothing" can stabilize the "oscillation", and the iterates of the new proximal ADMM algorithm converge to a stationary point under mild regularity conditions. Furthermore, when the objective function is quadratic, we establish linear convergence of the algorithm. Our proof is based on a new potential function and a novel use of error bounds.
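
The abstract describes the iteration only at a high level. The sketch below is a minimal, single-block illustration of that scheme on a toy quadratic problem over a box with linear equality constraints; the objective, the penalty rho, proximal weight p, smoothing parameter beta, and the stepsizes alpha and s are illustrative assumptions, not the paper's prescribed choices or its multi-block setting.

```python
# Minimal sketch (assumed parameters) of the smoothed proximal ADMM iteration
# described in the abstract: gradient projection on the proximal augmented
# Lagrangian, dual update via the constraint residual, and an exponentially
# weighted ("smoothed") primal sequence that centers the proximal term.
import numpy as np

rng = np.random.default_rng(0)
n, m = 6, 3
Q = rng.standard_normal((n, n))
Q = Q.T @ Q - 0.5 * np.eye(n)            # shifted so f may be nonconvex
c = rng.standard_normal(n)
A = rng.standard_normal((m, n))
b = A @ rng.uniform(0.0, 1.0, n)         # feasible right-hand side
lo, hi = np.zeros(n), np.ones(n)         # box part of the polyhedron

f_grad = lambda x: Q @ x + c             # gradient of f(x) = 0.5 x'Qx + c'x

rho, p, beta = 5.0, 10.0, 0.2            # penalty, proximal weight, smoothing (assumed)
alpha, s = 1e-2, 1e-2                    # small primal and dual stepsizes (assumed)

x = rng.uniform(0.0, 1.0, n)
z = x.copy()                             # smoothed (exponentially weighted) primal iterate
lam = np.zeros(m)

for k in range(20000):
    # gradient of the proximal augmented Lagrangian at (x, lam), centered at z
    g = f_grad(x) + A.T @ (lam + rho * (A @ x - b)) + p * (x - z)
    x = np.clip(x - alpha * g, lo, hi)   # gradient projection step onto the box
    lam = lam + s * (A @ x - b)          # dual update using the constraint residual
    z = z + beta * (x - z)               # exponentially weighted smoothing

print("constraint residual:", np.linalg.norm(A @ x - b))
```

Here z acts as an exponentially weighted average of past primal iterates; per the abstract, centering the quadratic proximal term at this smoothed point is what damps the "oscillation" that the plain gradient projection augmented Lagrangian method can exhibit.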
