Central limit theorems for stochastic gradient descent with averaging for stable manifolds (1912.09187v1)

Published 19 Dec 2019 in math.PR, math.ST, and stat.TH

Abstract: In this article we establish new central limit theorems for Ruppert-Polyak averaged stochastic gradient descent schemes. Compared to previous work we do not assume that convergence occurs to an isolated attractor but instead allow convergence to a stable manifold. On the stable manifold the target function is constant and the oscillations in the tangential direction may be significantly larger than the ones in the normal direction. As we show, one still recovers a central limit theorem with the same rates as in the case of isolated attractors. Here we consider step-sizes $\gamma_n = n^{-\gamma}$ with $\gamma \in (\frac{3}{4}, 1)$, typically.
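As a rough illustration of the scheme the abstract describes, the sketch below runs Ruppert-Polyak averaged SGD with step-sizes $\gamma_n = n^{-\gamma}$ and $\gamma = 0.8 \in (\frac{3}{4}, 1)$ on a toy objective whose minimum set is a manifold (the unit circle) rather than an isolated point. The objective, noise model, initialization, and all constants are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Toy objective f(x) = (|x|^2 - 1)^2 / 4, whose gradient vanishes on the
# whole unit circle |x| = 1: a stable manifold of minimizers rather than
# an isolated attractor. (Illustrative choice, not from the paper.)
def grad_f(x):
    return (x @ x - 1.0) * x

rng = np.random.default_rng(0)
gamma = 0.8                  # step-size exponent in (3/4, 1), as in the abstract
x = np.array([1.2, 0.3])     # start near the manifold so the toy run stays stable
x_bar = np.zeros_like(x)     # Ruppert-Polyak running average of the iterates

for n in range(1, 100_001):
    step = n ** (-gamma)                    # gamma_n = n^{-gamma}
    noise = rng.normal(scale=0.1, size=2)   # assumed additive gradient noise
    x = x - step * (grad_f(x) + noise)      # noisy gradient step
    x_bar += (x - x_bar) / n                # incremental mean of the iterates

# Distance of the averaged iterate to the stable manifold |x| = 1.
print("dist(x_bar, manifold) =", abs(np.linalg.norm(x_bar) - 1.0))
```

On such a run the averaged iterate settles near the circle; the paper's point is that, even though the tangential oscillations along the manifold may be much larger than the normal ones, the averaged error in the normal direction still satisfies a central limit theorem at the same rate as in the isolated-attractor case.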
