The Nesterov-Spokoiny Acceleration Achieves Strict $o(1/k^2)$ Convergence
Abstract: We study a variant of an accelerated gradient algorithm of Nesterov and Spokoiny, which we call the Nesterov--Spokoiny Acceleration (NSA). The NSA algorithm satisfies the following property: for a smooth convex objective $f \in \mathscr{F}_{L}^{\infty,1}(\mathbb{R}^n)$, the sequence $\{ \mathbf{x}_k \}_{k \in \mathbb{N}}$ governed by NSA satisfies $\limsup\limits_{k \to \infty} k^2 \left( f(\mathbf{x}_k) - f^* \right) = 0$, where $f^* > -\infty$ is the minimum of $f$.
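For context, the paper's $o(1/k^2)$ claim strictly improves on the classical $O(1/k^2)$ rate of Nesterov-type acceleration. The sketch below is a hypothetical illustration of the *classical* Nesterov accelerated gradient method (not the NSA variant, whose update rules are not given in this abstract) on a smooth convex quadratic with known minimum $f^* = 0$; the objective, step size, and iteration count are all illustrative assumptions.

```python
import numpy as np

def nesterov_agd(grad, x0, L, num_iters):
    """Classical Nesterov accelerated gradient descent with step size 1/L.

    NOT the NSA algorithm from the paper; a standard baseline that
    attains the O(1/k^2) rate on L-smooth convex objectives.
    """
    x = y = np.asarray(x0, dtype=float)
    for k in range(1, num_iters + 1):
        x_next = y - grad(y) / L                        # gradient step from extrapolated point
        y = x_next + (k - 1) / (k + 2) * (x_next - x)   # momentum extrapolation
        x = x_next
    return x

# Illustrative smooth convex quadratic f(x) = 0.5 x^T A x with f* = 0.
A = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
L = 10.0  # largest eigenvalue of A, a gradient Lipschitz constant

x_final = nesterov_agd(grad, np.array([5.0, 5.0]), L, 2000)
print(f(x_final))  # f(x_k) - f* ends up far below the 2L||x0 - x*||^2 / (k+1)^2 envelope
```

On this strongly convex instance the method in fact converges linearly, so the final gap sits well under the worst-case $O(1/k^2)$ bound; the paper's contribution concerns the sharper asymptotic $\limsup_k k^2 (f(\mathbf{x}_k) - f^*) = 0$ for general smooth convex $f$.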