Nonmonotone subgradient methods based on a local descent lemma (2510.19341v1)
Abstract: The aim of this paper is to extend the context of nonmonotone descent methods to the class of nonsmooth and nonconvex functions called upper-$\mathcal{C}^2$, which satisfy a nonsmooth and local version of the descent lemma. Under this assumption, we propose a general subgradient method that performs a nonmonotone linesearch, and we prove subsequential convergence to a stationary point of the optimization problem. Our approach allows us to cover the setting of various subgradient algorithms, including Newton and quasi-Newton methods. In addition, we propose a specification of the general scheme, named Self-adaptive Nonmonotone Subgradient Method (SNSM), which automatically updates the parameters of the linesearch. Particular attention is paid to the minimum sum-of-squares clustering problem, for which we provide a concrete implementation of SNSM. We conclude with some numerical experiments where we exhibit the advantages of SNSM in comparison with some known algorithms.
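To make the key ingredient concrete, the following is a minimal sketch of a subgradient method with a max-type nonmonotone (Grippo-Lampariello-Lucidi style) linesearch. It is an illustrative generic scheme, not the paper's SNSM: the function names, the acceptance rule, and all parameter values (`memory`, `sigma`, `beta`, `t0`) are assumptions chosen for clarity, and the self-adaptive parameter updates of SNSM are not reproduced here.

```python
import numpy as np

def nonmonotone_subgradient_method(f, subgrad, x0, max_iter=500, memory=10,
                                   sigma=1e-4, beta=0.5, t0=1.0, tol=1e-8):
    """Generic nonmonotone subgradient method (illustrative sketch only).

    The linesearch accepts a step when it achieves sufficient decrease
    relative to the maximum of the last `memory` function values, rather
    than the current value, which allows occasional increases of f.
    """
    x = np.asarray(x0, dtype=float)
    f_hist = [f(x)]                      # recent values for the nonmonotone test
    for _ in range(max_iter):
        g = subgrad(x)                   # any subgradient at the current iterate
        if np.linalg.norm(g) <= tol:     # crude approximate-stationarity test
            break
        f_ref = max(f_hist[-memory:])    # nonmonotone reference value
        t = t0
        # Backtrack until the relaxed (nonmonotone) Armijo-type test holds.
        while f(x - t * g) > f_ref - sigma * t * np.dot(g, g):
            t *= beta
            if t < 1e-16:                # safeguard against endless backtracking
                break
        x = x - t * g
        f_hist.append(f(x))
    return x

# Usage on a simple nonsmooth example, f(x) = ||x||_1:
if __name__ == "__main__":
    f = lambda x: np.abs(x).sum()
    subgrad = lambda x: np.sign(x)       # a valid subgradient of the l1 norm
    print(nonmonotone_subgradient_method(f, subgrad, x0=np.array([3.0, -2.0])))
```

The nonmonotone reference value `f_ref` is what distinguishes this scheme from a classical monotone linesearch; for upper-$\mathcal{C}^2$ objectives, the paper's local descent lemma is what justifies the existence of acceptable stepsizes, but the convergence analysis itself is beyond this sketch.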