LeAP-SSN: A Semismooth Newton Method with Global Convergence Rates (2508.16468v1)
Abstract: We propose LeAP-SSN (Levenberg--Marquardt Adaptive Proximal Semismooth Newton method), a semismooth Newton-type method with a simple, parameter-free globalisation strategy that, in Hilbert spaces, guarantees convergence from arbitrary starting points: to stationary points in nonconvex settings and, under a Polyak--Łojasiewicz condition, to a global minimum. The method employs an adaptive Levenberg--Marquardt regularisation for the Newton steps, combined with backtracking, and does not require knowledge of problem-specific constants. We establish global nonasymptotic rates: $\mathcal{O}(1/k)$ for convex problems in terms of objective values, $\mathcal{O}(1/\sqrt{k})$ for nonconvex problems in terms of subgradient norms, and linear convergence under a Polyak--Łojasiewicz condition. The algorithm achieves superlinear convergence under mild semismoothness and Dennis--Moré or partial smoothness conditions, even for non-isolated minimisers. By combining strong global guarantees with superlinear local rates in a fully parameter-agnostic framework, LeAP-SSN bridges the gap between globally convergent algorithms and the fast asymptotics of Newton's method. The practical efficiency of the method is illustrated on representative problems from imaging, contact mechanics, and machine learning.
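To make the abstract's globalisation strategy concrete, here is a minimal sketch of an adaptively regularised Newton iteration with backtracking on the damping parameter. Everything below is illustrative rather than the paper's actual LeAP-SSN algorithm: the function name `lm_ssn_sketch`, the gradient-norm acceptance test, the doubling/halving update of the regularisation parameter, and the quartic test problem are all assumptions, and `hess` stands in for an element of the generalised Jacobian (an ordinary Hessian in this smooth example).

```python
import numpy as np

def lm_ssn_sketch(grad, hess, x0, max_iter=100, tol=1e-10):
    """Illustrative Levenberg-Marquardt-regularised Newton iteration.

    `grad(x)` returns the (sub)gradient; `hess(x)` returns an element of
    the generalised Jacobian (here an ordinary Hessian, for simplicity).
    The damping parameter `lam` is adapted without problem-specific
    constants: doubled while a trial step is rejected, halved on acceptance.
    """
    x, lam = np.asarray(x0, dtype=float), 1.0
    n = x.size
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:  # approximate stationarity reached
            return x
        H = hess(x)
        # Backtracking on the regularisation: enlarge lam until the
        # regularised Newton step sufficiently decreases the gradient norm.
        while True:
            d = np.linalg.solve(H + lam * np.eye(n), -g)
            x_trial = x + d
            if np.linalg.norm(grad(x_trial)) <= (1.0 - 1e-4) * gnorm:
                break  # trial step accepted
            if lam > 1e12:
                break  # safeguard against an unbounded inner loop
            lam *= 2.0  # trial step rejected: increase damping
        x, lam = x_trial, max(lam / 2.0, 1e-12)  # accept and relax damping
    return x

# Hypothetical test problem: f(x) = 0.25*(x.x)^2 + 0.5*||x - b||^2.
b = np.array([1.0, -2.0])
grad = lambda x: (x @ x) * x + (x - b)
hess = lambda x: (x @ x) * np.eye(2) + 2.0 * np.outer(x, x) + np.eye(2)
print(lm_ssn_sketch(grad, hess, np.zeros(2)))  # converges near the minimiser
```

The accept/reject loop mirrors the role backtracking plays in the abstract: damping grows only when a step fails a decrease test, so no Lipschitz or curvature constants need to be supplied; near a solution the damping shrinks and the iteration approaches a pure Newton step.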