Revisiting Randomized Smoothing: Nonsmooth Nonconvex Optimization Beyond Global Lipschitz Continuity (2508.13496v1)

Published 19 Aug 2025 in math.OC

Abstract: Randomized smoothing is a widely adopted technique for optimizing nonsmooth objective functions. However, its efficiency analysis typically relies on global Lipschitz continuity, a condition rarely met in practical applications. To address this limitation, we introduce a new subgradient growth condition that naturally encompasses a wide range of locally Lipschitz functions, with the classical global Lipschitz function as a special case. Under this milder condition, we prove that randomized smoothing yields a differentiable function that satisfies certain generalized smoothness properties. To optimize such functions, we propose novel randomized smoothing gradient algorithms that, with high probability, converge to $(\delta, \epsilon)$-Goldstein stationary points and achieve a sample complexity of $\tilde{\mathcal{O}}(d^{5/2}\delta^{-1}\epsilon^{-4})$. By incorporating variance reduction techniques, we further improve the sample complexity to $\tilde{\mathcal{O}}(d^{3/2}\delta^{-1}\epsilon^{-3})$, matching the optimal $\epsilon$-bound under the global Lipschitz assumption, up to a logarithmic factor. Experimental results validate the effectiveness of our proposed algorithms.
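To make the core idea concrete, below is a minimal sketch of the classical randomized-smoothing gradient method the abstract builds on: a zeroth-order, two-point estimator of the gradient of the smoothed surrogate $f_\delta(x) = \mathbb{E}_u[f(x + \delta u)]$, plugged into plain gradient descent. This is not the paper's proposed algorithm (which adds a high-probability analysis under the subgradient growth condition and a variance-reduced variant); the function names, step size, and sample counts here are illustrative assumptions.

import numpy as np

def smoothed_grad_estimate(f, x, delta, num_samples, rng):
    """Two-point (central-difference) estimate of the gradient of the
    smoothed function f_delta(x) = E_u[f(x + delta * u)], where u is
    drawn uniformly from the unit sphere."""
    d = x.shape[0]
    g = np.zeros(d)
    for _ in range(num_samples):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)  # uniform direction on the unit sphere
        g += (f(x + delta * u) - f(x - delta * u)) / (2 * delta) * u
    return (d / num_samples) * g  # dimension factor d makes this unbiased for grad f_delta

def randomized_smoothing_gd(f, x0, delta=0.1, lr=0.01, num_samples=32,
                            num_iters=500, seed=0):
    """Plain gradient descent on the smoothed surrogate f_delta.
    Hyperparameters are placeholders, not the paper's choices."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(num_iters):
        g = smoothed_grad_estimate(f, x, delta, num_samples, rng)
        x -= lr * g
    return x

# Example: a nonsmooth objective that is only locally Lipschitz
# (its subgradients grow with ||x||), minimized at the origin.
f = lambda x: np.abs(x[0]) * (1.0 + x[1] ** 2) + np.abs(x[1])
x_final = randomized_smoothing_gd(f, x0=np.array([2.0, -1.5]))
print(x_final)  # approaches (0, 0)

The example objective is deliberately outside the global Lipschitz class, which is exactly the regime where the paper's subgradient growth condition, rather than the classical analysis, is needed to justify convergence.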
