An adaptive proximal safeguarded augmented Lagrangian method for nonsmooth DC problems with convex constraints (2505.15369v1)
Abstract: A proximal safeguarded augmented Lagrangian method is presented for minimizing the difference of convex (DC) functions over a nonempty, closed, and convex set with additional linear equality and convex inequality constraints; all functions involved may be nonsmooth. Iterates of the primal variable are obtained by solving convex subproblems in which the concave part of the objective is replaced by an affine linearization. Under a modified Slater constraint qualification, convergence of both the primal and dual variables to a generalized Karush-Kuhn-Tucker (KKT) point is proven, at least along a subsequence. Numerical experiments on several classes of constrained nonsmooth DC problems, including comparisons with existing solution methods, are reported.
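The iteration described in the abstract can be illustrated on a toy instance. The sketch below is an assumption-laden simplification, not the paper's algorithm verbatim: it minimizes g(x) - h(x) with g(x) = ½‖x‖² and h(x) = ‖x‖₁ subject to the single linear equality 1ᵀx = 1, linearizes the concave part -h via a subgradient of h at the current iterate so that each subproblem is convex (here quadratic, hence solvable as a linear system), and applies a safeguarded multiplier update that projects the dual variable onto a bounded set. All parameter values are illustrative.

```python
import numpy as np

# Toy DC problem: minimize 0.5*||x||^2 - ||x||_1  subject to  sum(x) = 1.
# g(x) = 0.5*||x||^2 is convex and smooth; h(x) = ||x||_1 is convex and
# nonsmooth, so g - h is a DC objective.

n = 3
A = np.ones((1, n))          # linear equality constraint: A x = b, i.e. sum(x) = 1
b = np.array([1.0])
rho = 10.0                   # augmented Lagrangian penalty parameter (illustrative)
mu = 1.0                     # proximal regularization weight (illustrative)
lam = np.zeros(1)            # multiplier estimate
lam_max = 100.0              # safeguard: keep multipliers in a bounded set

x = np.array([0.5, 0.3, 0.2])
for _ in range(200):
    s = np.sign(x)           # subgradient of h(x) = ||x||_1 at the current iterate
    # Convex subproblem (concave part replaced by its affine linearization):
    #   min_x 0.5*||x||^2 - s^T x + lam^T (A x - b)
    #         + (rho/2)*||A x - b||^2 + (mu/2)*||x - x_k||^2
    # Its first-order optimality condition is the linear system M x = rhs.
    M = (1.0 + mu) * np.eye(n) + rho * (A.T @ A)
    rhs = s - A.T @ lam + rho * (A.T @ b) + mu * x
    x = np.linalg.solve(M, rhs)
    # Safeguarded dual update: gradient ascent step on the multiplier,
    # then projection onto the bounded box [-lam_max, lam_max].
    lam = np.clip(lam + rho * (A @ x - b), -lam_max, lam_max)

obj = 0.5 * (x @ x) - np.abs(x).sum()
print(x, float(lam[0]), obj)
```

For this instance the iterates settle at x = (1/3, 1/3, 1/3) with multiplier 2/3, a stationary (generalized KKT) point of the toy problem; the general method handles nonsmooth g, additional convex inequality constraints, and inexact subproblem solves, which this closed-form sketch omits.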