Rate of convergence of the Nesterov accelerated gradient method in the subcritical case $\alpha \leq 3$
Abstract: In a Hilbert space setting $\mathcal H$, given $\Phi: \mathcal H \to \mathbb R$ a convex continuously differentiable function, and $\alpha$ a positive parameter, we consider the inertial system with Asymptotic Vanishing Damping \begin{equation*} \mbox{(AVD)}_{\alpha} \quad \quad \ddot{x}(t) + \frac{\alpha}{t} \dot{x}(t) + \nabla \Phi (x(t)) = 0. \end{equation*} Depending on the value of $\alpha$ with respect to 3, we give a complete picture of the convergence properties as $t \to + \infty$ of the trajectories generated by $\mbox{(AVD)}_{\alpha}$, as well as of the iterations of the corresponding algorithms. Our main result concerns the subcritical case $\alpha \leq 3$, where we show that $\Phi (x(t))-\min \Phi = \mathcal O \big(t^{-\frac{2}{3}\alpha}\big)$. Then we examine the convergence of trajectories to optimal solutions. As a new result, in the one-dimensional framework, for the critical value $\alpha = 3$, we prove the convergence of the trajectories without any restrictive hypothesis on the convex function $\Phi$. In the second part of this paper, we study the convergence properties of the associated forward-backward inertial algorithms, which aim to solve structured convex minimization problems of the form $\min \left\lbrace \Theta := \Phi + \Psi \right\rbrace$, with $\Phi$ smooth and $\Psi$ nonsmooth. The continuous dynamics serves as a guideline for this study. We obtain a similar rate of convergence for the sequence of iterates $(x_k)$: for $\alpha \leq 3$ we have $\Theta (x_k)-\min \Theta = \mathcal O (k^{-p})$ for all $p < \frac{2\alpha}{3}$, and for $\alpha > 3$ we have $\Theta (x_k)-\min \Theta = o (k^{-2})$. We conclude this study by showing that the results are robust with respect to external perturbations.
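To make the algorithmic side of the abstract concrete, below is a minimal sketch of a FISTA-type inertial forward-backward iteration with a vanishing-damping parameter $\alpha$, using the common extrapolation coefficient $k/(k+\alpha)$. The function names (`grad_Phi`, `prox_Psi`), the step-size choice, and the test problem are illustrative assumptions, not the exact scheme analyzed in the paper.

```python
import numpy as np

def inertial_forward_backward(grad_Phi, prox_Psi, x0, alpha=3.0, step=1.0, n_iter=500):
    """Sketch of an inertial forward-backward scheme with damping parameter alpha.

    Assumed update (one common FISTA-type variant, not necessarily the paper's):
        y_k     = x_k + k/(k + alpha) * (x_k - x_{k-1})
        x_{k+1} = prox_{step * Psi}(y_k - step * grad_Phi(y_k))
    """
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for k in range(1, n_iter + 1):
        # extrapolation (momentum) step with coefficient k / (k + alpha)
        y = x + (k / (k + alpha)) * (x - x_prev)
        # forward (gradient) step on the smooth part Phi,
        # backward (proximal) step on the nonsmooth part Psi
        x_prev, x = x, prox_Psi(y - step * grad_Phi(y), step)
    return x

# Illustrative use: Phi(x) = 0.5*||Ax - b||^2 (smooth), Psi(x) = lam*||x||_1 (nonsmooth)
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b, lam = rng.standard_normal((20, 50)), rng.standard_normal(20), 0.1
    grad_Phi = lambda x: A.T @ (A @ x - b)
    prox_Psi = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - lam * s, 0.0)  # soft-thresholding
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of grad Phi
    x_sol = inertial_forward_backward(grad_Phi, prox_Psi, np.zeros(50), alpha=3.0, step=step)
```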