
A Globalized Semismooth Newton Method for Prox-regular Optimization Problems (2509.05765v1)

Published 6 Sep 2025 in math.OC

Abstract: We are concerned with a class of nonconvex and nonsmooth composite optimization problems, comprising a twice differentiable function and a prox-regular function. We establish a sufficient condition for the proximal mapping of a prox-regular function to be single-valued and locally Lipschitz continuous. By virtue of this property, we propose a hybrid of proximal gradient and semismooth Newton methods for solving these composite optimization problems, which constitutes a globalized semismooth Newton method. The whole sequence is shown to converge to an $L$-stationary point under a Kurdyka-{\L}ojasiewicz exponent assumption. Under an additional error bound condition and some other mild conditions, we prove that the sequence converges to a nonisolated $L$-stationary point at a superlinear rate. Numerical comparisons with several existing second-order methods reveal that our approach performs comparably well on both $\ell_q\,(0<q<1)$ quasi-norm regularized problems and fused zero-norm regularized problems.
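To make the setting concrete, the sketch below shows the proximal-gradient half of such a hybrid scheme applied to an $\ell_q$ quasi-norm regularized problem with $q = 0.5$. This is an illustrative sketch only, not the paper's algorithm: the per-coordinate prox is approximated by a brute-force candidate search (the helper `prox_lq_scalar` is hypothetical), whereas the paper establishes conditions under which the proximal mapping of a prox-regular function is single-valued and locally Lipschitz, and couples these steps with semismooth Newton iterations.

```python
# Minimal sketch of a proximal-gradient step for a composite problem
#   min_x  f(x) + lam * sum_i |x_i|^q,   0 < q < 1  (here q = 0.5).
# Assumptions: f is smooth with an available gradient; the l_q term is
# separable, so the prox splits coordinate-wise.

def prox_lq_scalar(z, tau, q=0.5, grid=2000):
    """Approximate argmin_t 0.5*(t - z)^2 + tau*|t|**q by candidate search
    over [0, z] (or [z, 0]) plus the kink t = 0. A crude numerical stand-in
    for the single-valued prox analyzed in the paper."""
    best_t, best_v = 0.0, 0.5 * z * z  # t = 0 is always a candidate
    lo, hi = (0.0, z) if z >= 0 else (z, 0.0)
    for k in range(1, grid + 1):
        t = lo + (hi - lo) * k / grid
        v = 0.5 * (t - z) ** 2 + tau * abs(t) ** q
        if v < best_v:
            best_t, best_v = t, v
    return best_t

def prox_grad_step(x, grad, step, lam, q=0.5):
    """One step: x+ = prox_{step*lam*|.|^q}(x - step*grad), coordinate-wise."""
    return [prox_lq_scalar(xi - step * gi, step * lam, q)
            for xi, gi in zip(x, grad)]

# Tiny 1-D example: f(x) = 0.5*(x - 3)^2, so grad f(x) = x - 3.
x = [0.0]
for _ in range(50):
    grad = [x[0] - 3.0]
    x = prox_grad_step(x, grad, step=1.0, lam=0.1)
# The iterates settle near x = 3 shifted slightly toward 0 by the l_q penalty.
```

In the paper's globalized method, steps like these provide global convergence safeguards, while semismooth Newton steps on the natural residual deliver the superlinear local rate; this sketch covers only the first ingredient.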
