Convergence of stochastic AGD under weak growth or stochastic line-search
Establish convergence guarantees for stochastic Nesterov accelerated gradient descent (stochastic AGD) under relaxed assumptions. Specifically, prove convergence either under the weak growth condition, E[||∇f_i(w)||^2] ≤ 2αL(f(w) − f(w*)) for some constant α, or when step sizes are chosen via a stochastic line-search, in contrast to the strong growth condition setting analyzed in this work.
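
To fix ideas, a minimal sketch of the kind of method in question is shown below: stochastic AGD where each step size comes from a backtracking Armijo line-search on the sampled loss, run on a toy over-parameterized least-squares problem where interpolation holds. The momentum schedule (k−1)/(k+2), the Armijo constants, and the toy problem are illustrative assumptions, not the algorithm or analysis of the paper.

```python
# Hedged sketch: stochastic Nesterov AGD with a per-sample Armijo line-search.
# All constants and the problem instance are assumptions made for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Toy over-parameterized least squares (n < d), so interpolation holds and f(w*) = 0.
n, d = 20, 100
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d)

def f_i(w, i):
    # Loss on a single sampled example.
    r = A[i] @ w - b[i]
    return 0.5 * r * r

def grad_i(w, i):
    # Gradient of the sampled loss.
    return (A[i] @ w - b[i]) * A[i]

def stochastic_armijo(w, i, g, eta=1.0, c=0.5, beta=0.8, max_backtracks=50):
    # Backtrack on the *sampled* loss f_i until the Armijo condition holds.
    fw = f_i(w, i)
    g_norm2 = g @ g
    for _ in range(max_backtracks):
        if f_i(w - eta * g, i) <= fw - c * eta * g_norm2:
            break
        eta *= beta
    return eta

# Stochastic AGD: extrapolate with Nesterov momentum, then take a stochastic
# gradient step whose size is chosen by the stochastic line-search.
w = np.zeros(d)
w_prev = w.copy()
for k in range(1, 2001):
    momentum = (k - 1) / (k + 2)   # assumed convex-style momentum schedule
    y = w + momentum * (w - w_prev)
    i = rng.integers(n)            # sample one example
    g = grad_i(y, i)
    eta = stochastic_armijo(y, i, g)
    w_prev, w = w, y - eta * g

print("final full loss:", 0.5 * np.mean((A @ w - b) ** 2))
```

The open problem is to show that iterations of this form converge (and at what rate) when only the weak growth condition holds, rather than the strong growth condition used in the paper's analysis.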
References
For example, the convergence of stochastic AGD under relaxed conditions, like weak growth or with a stochastic line-search \citep{vaswani2019fast}, has not been proved.
— Faster Convergence of Stochastic Accelerated Gradient Descent under Interpolation (2404.02378 - Mishkin et al., 3 Apr 2024), Conclusion, final paragraph