Convergence Rates of Stochastic Zeroth-order Gradient Descent for Łojasiewicz Functions (2210.16997v6)

Published 31 Oct 2022 in math.OC and cs.LG

Abstract: We prove convergence rates of Stochastic Zeroth-order Gradient Descent (SZGD) algorithms for Łojasiewicz functions. The SZGD algorithm iterates as \begin{align*} \mathbf{x}_{t+1} = \mathbf{x}_t - \eta_t \widehat{\nabla} f (\mathbf{x}_t), \qquad t = 0,1,2,3,\cdots, \end{align*} where $f$ is the objective function that satisfies the Łojasiewicz inequality with Łojasiewicz exponent $\theta$, $\eta_t$ is the step size (learning rate), and $\widehat{\nabla} f (\mathbf{x}_t)$ is the approximate gradient estimated using zeroth-order information only. Our results show that $\{ f (\mathbf{x}_t) - f (\mathbf{x}_\infty) \}_{t \in \mathbb{N}}$ can converge faster than $\{ \| \mathbf{x}_t - \mathbf{x}_\infty \| \}_{t \in \mathbb{N}}$, regardless of whether the objective $f$ is smooth or nonsmooth.
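The iteration above only requires function evaluations, not gradients. Below is a minimal Python sketch of such an iteration, assuming a Gaussian-smoothing forward-difference estimator for $\widehat{\nabla} f$; the estimator, the helper names (`zeroth_order_gradient`, `szgd`), and the constant step size are illustrative assumptions, not the paper's exact construction or step-size schedule.

```python
import numpy as np

def zeroth_order_gradient(f, x, mu=1e-4, num_samples=10, rng=None):
    """Estimate the gradient of f at x using only function evaluations.

    Gaussian-smoothing forward differences along random directions
    (an illustrative choice; the paper's estimator may differ).
    """
    rng = rng or np.random.default_rng()
    d = x.shape[0]
    grad = np.zeros(d)
    for _ in range(num_samples):
        u = rng.standard_normal(d)                   # random direction
        grad += (f(x + mu * u) - f(x)) / mu * u      # forward-difference term
    return grad / num_samples

def szgd(f, x0, step_size, num_iters=200):
    """SZGD iteration: x_{t+1} = x_t - eta_t * grad_hat f(x_t)."""
    x = np.asarray(x0, dtype=float)
    for t in range(num_iters):
        g_hat = zeroth_order_gradient(f, x)
        x = x - step_size(t) * g_hat
    return x

# Example: a quadratic objective, which satisfies the Łojasiewicz
# inequality with exponent theta = 1/2.
if __name__ == "__main__":
    f = lambda x: np.sum(x ** 2)
    x_final = szgd(f, x0=np.ones(5), step_size=lambda t: 0.1)
    print(x_final)  # should be close to the minimizer at the origin
```

On the quadratic example, both the function-value gap and the iterate distance converge, and the abstract's point is that the former sequence can converge at a faster rate than the latter.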
