Preconditioning ideas for the Augmented Lagrangian method (1702.07196v1)

Published 23 Feb 2017 in math.OC

Abstract: A preconditioning strategy for the Powell-Hestenes-Rockafellar Augmented Lagrangian method (ALM) is presented. The scheme exploits the structure of the Augmented Lagrangian Hessian. It is a modular preconditioner consisting of two blocks: the first is associated with the Lagrangian of the objective, while the second handles the Jacobian of the constraints and possible low-rank corrections to the Hessian. The proposed updating strategies take advantage of ALM convergence results and avoid frequent refreshing. Constraint handling takes into account complementarity over the Lagrange multipliers and admits relaxation. The preconditioner is designed for problems where the number of constraints is small relative to the dimension of the search space. A virtue of the scheme is that it is agnostic to the preconditioning technique used for the Hessian of the Lagrangian function. The strategy described can be used for linear and nonlinear preconditioning. Numerical experiments report on spectral properties of preconditioned matrices from the Matrix Market collection, while some optimization problems were taken from the CUTEst collection. Preliminary results indicate that the proposed scheme could be attractive, and further experimentation is encouraged.
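The abstract does not reproduce the formulas, but the Hessian structure it exploits is standard. For the equality-constrained case (a simplification; the PHR setting of the paper also covers inequalities via complementarity over the multipliers), the augmented Lagrangian and its Hessian are

$$ \mathcal{L}_\rho(x,\lambda) = f(x) + \lambda^{T} c(x) + \tfrac{\rho}{2}\,\|c(x)\|^{2}, $$

$$ \nabla^{2}_{xx}\mathcal{L}_\rho(x,\lambda) = \nabla^{2}_{xx}\mathcal{L}(x,\bar\lambda) + \rho\, J(x)^{T} J(x), \qquad \bar\lambda = \lambda + \rho\, c(x), $$

i.e. the Hessian of the ordinary Lagrangian at a shifted multiplier plus a rank-$m$ term built from the constraint Jacobian $J \in \mathbb{R}^{m \times n}$. A two-block preconditioner can mirror this split: take any preconditioner $P_H \approx \nabla^{2}_{xx}\mathcal{L}$ for the first block and absorb the Jacobian term through the Sherman-Morrison-Woodbury identity,

$$ (P_H + \rho\, J^{T} J)^{-1} = P_H^{-1} - \rho\, P_H^{-1} J^{T} \left( I_m + \rho\, J P_H^{-1} J^{T} \right)^{-1} J P_H^{-1}, $$

which only requires factoring an $m \times m$ matrix and is therefore cheap precisely when the number of constraints is small relative to the dimension of the search space, as the abstract assumes. It also makes the scheme agnostic to how $P_H$ itself is built.

A minimal Python sketch of this Woodbury-based application follows. It is an illustration under the assumptions above, not the authors' implementation: the function name is hypothetical, the capacitance matrix is solved exactly, and the multiplier complementarity, relaxation, and refresh rules described in the abstract are omitted.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator

def alm_preconditioner(apply_PH_inv, J, rho):
    """LinearOperator applying (P_H + rho * J^T J)^{-1} via the
    Sherman-Morrison-Woodbury identity.

    apply_PH_inv : callable mapping v to P_H^{-1} v, for any symmetric
                   positive definite preconditioner P_H of the Lagrangian
                   Hessian (the scheme is agnostic to this choice)
    J            : (m, n) dense constraint Jacobian, with m << n
    rho          : ALM penalty parameter
    """
    m, n = J.shape
    # n x m block P_H^{-1} J^T: one application of P_H^{-1} per constraint
    PinvJT = np.column_stack([apply_PH_inv(J[i, :]) for i in range(m)])
    # Small m x m capacitance matrix I_m + rho * J P_H^{-1} J^T
    C = np.eye(m) + rho * (J @ PinvJT)

    def matvec(v):
        y = apply_PH_inv(v)
        # Subtract the low-rank Woodbury correction for the rho * J^T J term
        return y - rho * (PinvJT @ np.linalg.solve(C, J @ y))

    return LinearOperator((n, n), matvec=matvec, dtype=float)
```

The resulting operator can be passed as the M argument of an iterative solver such as scipy.sparse.linalg.minres. Note how the two blocks stay modular: for fixed $J$ and $P_H$, changing the penalty parameter $\rho$ only requires rebuilding the small capacitance matrix, not $P_H$, which is consistent with the infrequent refreshing the abstract advocates.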
