Stochastic Subgradient Methods with Guaranteed Global Stability in Nonsmooth Nonconvex Optimization (2307.10053v4)

Published 19 Jul 2023 in math.OC, cs.AI, cs.LG, and stat.ML

Abstract: In this paper, we focus on providing convergence guarantees for stochastic subgradient methods in minimizing nonsmooth nonconvex functions. We first investigate the global stability of a general framework for stochastic subgradient methods, where the corresponding differential inclusion admits a coercive Lyapunov function. We prove that, for any sequence of sufficiently small stepsizes and approximation parameters, coupled with sufficiently controlled noise, the iterates are uniformly bounded and asymptotically stabilize around the stable set of the corresponding differential inclusion. Moreover, we develop an improved analysis that applies our proposed framework to establish the global stability of a wide range of stochastic subgradient methods, where the corresponding Lyapunov functions are possibly non-coercive. These theoretical results illustrate the promising potential of our proposed framework for establishing the global stability of various stochastic subgradient methods.
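For orientation, schemes of this type are commonly written in the following generic form; this is a minimal sketch with illustrative notation ($\eta_k$, $d_k$, $\xi_k$), not the paper's exact parameterization, which also involves approximation parameters:

$$
x_{k+1} = x_k - \eta_k \,(d_k + \xi_k), \qquad d_k \in \partial f(x_k),
$$

where $\eta_k > 0$ are stepsizes, $d_k$ is a (sub)gradient estimate drawn from the Clarke subdifferential $\partial f(x_k)$, and $\xi_k$ is stochastic noise. The asymptotic behavior of the iterates is then related to trajectories of the associated differential inclusion

$$
\frac{\mathrm{d}x}{\mathrm{d}t} \in -\partial f(x),
$$

and a Lyapunov function for this inclusion (coercive in the basic framework, possibly non-coercive in the extended analysis) is what underpins the uniform boundedness and stabilization guarantees described in the abstract.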

Authors (3)
  1. Nachuan Xiao (20 papers)
  2. Xiaoyin Hu (10 papers)
  3. Kim-Chuan Toh (111 papers)
Citations (7)