Implicit augmented Lagrangian and generalized optimization (2302.00363v2)
Abstract: Generalized nonlinear programming is considered without any convexity assumption, capturing a variety of problems that include nonsmooth objectives, combinatorial structures, and set-membership nonlinear constraints. We extend the augmented Lagrangian framework to this broad problem class, preserving an implicit formulation and introducing slack variables merely as a formal device. This, however, gives rise to a generalized augmented Lagrangian function that lacks regularity, due to the marginalization with respect to slack variables. Based on parametric optimization, we develop a tailored stationarity concept to better qualify the iterates, generated as approximate solutions to a sequence of subproblems. Using this variational characterization and the lifted representation, a suitable multiplier update rule is derived, and then asymptotic properties and convergence guarantees are established for a safeguarded augmented Lagrangian scheme. An illustrative numerical example showcases the modelling versatility gained by dropping convexity assumptions and indicates the practical benefits of the advocated implicit approach.
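To fix ideas, the classical safeguarded augmented Lagrangian loop that the paper extends can be sketched on a smooth toy problem. The sketch below is only a minimal illustration of the standard scheme (subproblem solve, first-order multiplier update, safeguarding by projection onto a bounded interval), not the paper's implicit generalized method; the objective, constraint, step sizes, and bounds are all hypothetical choices for the example.

```python
import math

def f(x, y):
    # hypothetical smooth objective: squared distance to (2, 1)
    return (x - 2)**2 + (y - 1)**2

def c(x, y):
    # single equality constraint x + y = 1
    return x + y - 1

def al_grad(x, y, lam, rho):
    # gradient of the augmented Lagrangian
    #   L(x, y) = f(x, y) + lam * c(x, y) + (rho / 2) * c(x, y)**2
    cv = c(x, y)
    gx = 2 * (x - 2) + lam + rho * cv
    gy = 2 * (y - 1) + lam + rho * cv
    return gx, gy

def inner_solve(x, y, lam, rho, tol=1e-8, max_iter=10000):
    # approximate subproblem minimizer via gradient descent;
    # step = 1/L with L the largest Hessian eigenvalue 2 + 2*rho
    step = 1.0 / (2.0 + 2.0 * rho)
    for _ in range(max_iter):
        gx, gy = al_grad(x, y, lam, rho)
        if math.hypot(gx, gy) < tol:
            break
        x -= step * gx
        y -= step * gy
    return x, y

def safeguarded_alm(x=0.0, y=0.0, lam=0.0, rho=10.0,
                    lam_bound=100.0, outer_iters=20):
    for _ in range(outer_iters):
        x, y = inner_solve(x, y, lam, rho)
        # classical first-order multiplier update, safeguarded by
        # projecting the multiplier onto a fixed bounded interval
        lam = max(-lam_bound, min(lam_bound, lam + rho * c(x, y)))
    return x, y, lam
```

For this toy instance the KKT solution is (x, y) = (1, 0) with multiplier 2, and the multiplier iteration contracts with factor 1/(1 + rho) per outer step, so a modest fixed penalty already yields fast convergence; the safeguard only becomes active when multipliers would otherwise diverge.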