
Complexity of Finding Stationary Points of Nonsmooth Nonconvex Functions (2002.04130v3)

Published 10 Feb 2020 in math.OC and cs.LG

Abstract: We provide the first non-asymptotic analysis for finding stationary points of nonsmooth, nonconvex functions. In particular, we study the class of Hadamard semi-differentiable functions, perhaps the largest class of nonsmooth functions for which the chain rule of calculus holds. This class contains examples such as ReLU neural networks and others with non-differentiable activation functions. We first show that finding an $\epsilon$-stationary point with first-order methods is impossible in finite time. We then introduce the notion of $(\delta, \epsilon)$-stationarity, which allows for an $\epsilon$-approximate gradient to be the convex combination of generalized gradients evaluated at points within distance $\delta$ to the solution. We propose a series of randomized first-order methods and analyze their complexity of finding a $(\delta, \epsilon)$-stationary point. Furthermore, we provide a lower bound and show that our stochastic algorithm has min-max optimal dependence on $\delta$. Empirically, our methods perform well for training ReLU neural networks.
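The notion of $(\delta, \epsilon)$-stationarity from the abstract can be illustrated on the simplest nonsmooth example, $f(x) = |x|$. The sketch below is not the authors' algorithm; it is a hypothetical one-dimensional check, assuming a naive sampling scheme: draw generalized gradients at points within distance $\delta$ of $x$ and test whether some convex combination of them has norm at most $\epsilon$.

```python
import numpy as np

def grad_abs(x):
    # A generalized gradient of f(x) = |x|: the sign, with an
    # arbitrary choice (+1) at the nondifferentiable point 0.
    return float(np.sign(x)) if x != 0 else 1.0

def is_delta_eps_stationary(x, delta=0.1, eps=0.05, n_samples=1000, seed=0):
    # Hypothetical check (illustration only, not the paper's method):
    # sample points within distance delta of x and collect their
    # generalized gradients.
    rng = np.random.default_rng(seed)
    samples = x + rng.uniform(-delta, delta, n_samples)
    grads = np.array([grad_abs(s) for s in samples])
    # In one dimension, the minimal-norm convex combination of the
    # sampled gradients is 0 if they have mixed signs, and otherwise
    # the smallest absolute value among them.
    if grads.min() <= 0.0 <= grads.max():
        min_norm = 0.0
    else:
        min_norm = float(np.abs(grads).min())
    return min_norm <= eps

# x = 0 is (delta, eps)-stationary for |x|: gradients at nearby points
# have mixed signs, so 0 lies in their convex hull.
print(is_delta_eps_stationary(0.0))  # True
# x = 1 is not: every gradient within delta = 0.1 equals +1.
print(is_delta_eps_stationary(1.0))  # False
```

This toy check shows why $(\delta, \epsilon)$-stationarity is attainable where exact $\epsilon$-stationarity is not: at the kink of $|x|$, no single gradient is small, yet a convex combination of nearby gradients vanishes.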

Citations (42)
