Preconditioning ideas for the Augmented Lagrangian method

Published 23 Feb 2017 in math.OC (arXiv:1702.07196v1)

Abstract: A preconditioning strategy for the Powell-Hestenes-Rockafellar Augmented Lagrangian method (ALM) is presented. The scheme exploits the structure of the Augmented Lagrangian Hessian. It is a modular preconditioner consisting of two blocks: the first is associated with the Lagrangian of the objective, while the second handles the Jacobian of the constraints and possible low-rank corrections to the Hessian. The proposed updating strategies take advantage of ALM convergence results and avoid frequent refreshing. Constraint handling takes into account complementarity of the Lagrange multipliers and admits relaxation. The preconditioner is designed for problems where the number of constraints is small compared to the dimension of the search space. A virtue of the scheme is that it is agnostic to the preconditioning technique used for the Hessian of the Lagrangian function. The strategy described can be used for linear and nonlinear preconditioning. Numerical experiments report on spectral properties of preconditioned matrices from Matrix Market, while some optimization problems were taken from the CUTEst collection. Preliminary results indicate that the proposed scheme could be attractive, and further experimentation is encouraged.
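To make the two-block idea concrete, the sketch below illustrates one standard way such a preconditioner can be applied when the number of (active) constraints m is much smaller than the dimension n. It assumes the usual PHR Hessian structure H_A = B + rho * J_A^T J_A (B the Hessian of the Lagrangian, J_A the Jacobian of the active constraints) and applies the inverse of P + rho * J_A^T J_A via the Sherman-Morrison-Woodbury identity, reusing any symmetric positive definite preconditioner P for the Lagrangian block. This is a minimal illustration under those assumptions, not the paper's exact updating, relaxation, or complementarity handling; the names `alm_block_preconditioner`, `apply_Pinv`, `J_A`, and `rho` are hypothetical.

```python
# Illustrative sketch (not the paper's exact scheme): apply
#   (P + rho * J_A^T J_A)^{-1}
# cheaply via Sherman-Morrison-Woodbury when J_A has few rows,
# reusing an existing preconditioner P for the Lagrangian block.
import numpy as np
from scipy.sparse.linalg import LinearOperator


def alm_block_preconditioner(apply_Pinv, J_A, rho, n):
    """Return a LinearOperator that applies (P + rho * J_A^T J_A)^{-1}.

    apply_Pinv : callable applying P^{-1} to a vector; P is assumed
                 symmetric positive definite (the user's preconditioner
                 for the Hessian-of-the-Lagrangian block).
    J_A        : (m, n) Jacobian of the active constraints, m << n.
    rho        : penalty parameter of the augmented Lagrangian.
    """
    J_A = np.asarray(J_A, dtype=float)
    m = J_A.shape[0]
    # Precompute P^{-1} J_A^T column by column (cheap since m is small)
    # and the small m x m capacitance matrix C = I/rho + J_A P^{-1} J_A^T.
    PinvJt = np.column_stack([apply_Pinv(J_A[i]) for i in range(m)])
    C = np.eye(m) / rho + J_A @ PinvJt
    L = np.linalg.cholesky(C)  # C is SPD when P is SPD

    def matvec(v):
        # Woodbury: (P + rho J^T J)^{-1} v
        #   = P^{-1} v - P^{-1} J^T C^{-1} J P^{-1} v
        y = apply_Pinv(v)
        w = np.linalg.solve(L.T, np.linalg.solve(L, J_A @ y))
        return y - PinvJt @ w

    return LinearOperator((n, n), matvec=matvec)
```

In a setting like the one the abstract describes, the returned operator could be passed as the `M` argument of an iterative solver such as `scipy.sparse.linalg.cg` for the inner ALM subproblem; the point mirrored here is that the scheme only needs `apply_Pinv` as a black box, so any preconditioning technique for the Hessian of the Lagrangian can be plugged in.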
