
An accelerated proximal PRS-SQP algorithm with dual ascent-descent procedures for smooth composite optimization (2505.09078v1)

Published 14 May 2025 in math.OC

Abstract: Conventional wisdom in composite optimization suggests augmented Lagrangian dual ascent (ALDA) in Peaceman-Rachford splitting (PRS) methods for dual feasibility. However, ALDA may fail when the primal iterate is a local minimum, a stationary point, or a coordinatewise solution of the highly nonconvex augmented Lagrangian function. Splitting sequential quadratic programming (SQP) methods utilize augmented Lagrangian dual descent (ALDD) to directly minimize the primal residual, circumventing the limitations of ALDA and achieving faster convergence in smooth optimization. This paper aims to present a fairly accessible generalization of two contrasting dual updates, ALDA and ALDD, for smooth composite optimization. A key feature of our PRS-SQP algorithm is its dual ascent-descent procedure, which provides a free direction rule for the dual updates and a new insight to explain the counterintuitive convergence behavior. Furthermore, we incorporate a hybrid acceleration technique that combines inertial extrapolation and back substitution to improve convergence. Theoretically, we establish the feasibility for a wider range of acceleration factors than previously known and derive convergence rates within the Kurdyka-Łojasiewicz framework. Numerical experiments validate the effectiveness and stability of the proposed method in various dual-update scenarios.
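To make the contrast between the two dual-update directions concrete, the sketch below shows a generic Peaceman-Rachford-type splitting iteration on a toy least-squares-plus-l1 problem, with a sign flag `s` marking where an ascent-style (`s = +1`) versus descent-style (`s = -1`) dual update would enter. This is a minimal illustration under assumed notation (the problem instance, the flag `s`, and the function names are ours); it does not implement the paper's PRS-SQP algorithm, its proximal terms, or the inertial-extrapolation and back-substitution acceleration.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau*||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prs_with_dual_direction(C, d, mu=0.1, beta=1.0, s=+1, iters=200):
    """Illustrative PRS-type splitting for
        min_x 0.5*||C x - d||^2 + mu*||z||_1   s.t.  x = z,
    with a dual-update direction flag s. A generic sketch, not the
    paper's PRS-SQP scheme.
    """
    n = C.shape[1]
    x, z, lam = np.zeros(n), np.zeros(n), np.zeros(n)
    # x-subproblem normal equations: (C^T C + beta*I) x = C^T d + beta*z - lam
    M = C.T @ C + beta * np.eye(n)
    Ctd = C.T @ d
    for _ in range(iters):
        x = np.linalg.solve(M, Ctd + beta * z - lam)    # smooth block
        lam = lam + s * beta * (x - z)                  # first (half) dual update
        z = soft_threshold(x + lam / beta, mu / beta)   # nonsmooth block via prox
        lam = lam + s * beta * (x - z)                  # second dual update
    return x, z

# Usage: a small random instance
rng = np.random.default_rng(0)
C = rng.standard_normal((30, 10))
d = rng.standard_normal(30)
x_hat, z_hat = prs_with_dual_direction(C, d, s=+1)
```

With `s = +1` this reduces to the familiar PRS/ADMM multiplier ascent; the paper's "free direction rule" concerns when and why the opposite (descent-style) update can be preferable, which the sketch only locates structurally rather than analyzes.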

