Generalizing the optimized gradient method for smooth convex minimization (1607.06764v4)
Abstract: This paper generalizes the optimized gradient method (OGM) that achieves the optimal worst-case cost function bound among first-order methods for smooth convex minimization. Specifically, this paper studies a generalized formulation of OGM and analyzes its worst-case rates in terms of both the function value and the norm of the function gradient. This paper also develops a new algorithm, called OGM-OG, that belongs to the generalized OGM family and has the best known analytical worst-case bound, with rate $O(1/N^{1.5})$, on the decrease of the gradient norm among fixed-step first-order methods. This paper further proves that Nesterov's fast gradient method also has an $O(1/N^{1.5})$ worst-case gradient norm rate, but with a larger constant than OGM-OG. The proofs are based on the worst-case analysis framework known as the Performance Estimation Problem (PEP).
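To make the comparison between the two methods concrete, below is a minimal sketch of Nesterov's fast gradient method (FGM) and the standard OGM iteration in their commonly published fixed-step forms, assuming an $L$-smooth convex objective with known Lipschitz constant $L$; the function names, the quadratic test problem, and the parameter choices are illustrative assumptions, not the paper's OGM-OG variant.

```python
import numpy as np

def fgm(grad, x0, L, N):
    """Nesterov's fast gradient method (standard form) for L-smooth convex f."""
    x, y = x0.copy(), x0.copy()
    t = 1.0
    for _ in range(N):
        y_next = x - grad(x) / L                         # gradient step
        t_next = (1 + np.sqrt(1 + 4 * t**2)) / 2         # momentum coefficient update
        x = y_next + ((t - 1) / t_next) * (y_next - y)   # momentum step
        y, t = y_next, t_next
    return y

def ogm(grad, x0, L, N):
    """Optimized gradient method (standard fixed-step form); differs from FGM
    by an extra momentum term and a modified coefficient at the final step."""
    x, y = x0.copy(), x0.copy()
    theta = 1.0
    for k in range(N):
        y_next = x - grad(x) / L
        if k < N - 1:
            theta_next = (1 + np.sqrt(1 + 4 * theta**2)) / 2
        else:
            theta_next = (1 + np.sqrt(1 + 8 * theta**2)) / 2  # larger final coefficient
        x = (y_next
             + ((theta - 1) / theta_next) * (y_next - y)      # Nesterov-type momentum
             + (theta / theta_next) * (y_next - x))           # extra OGM momentum term
        y, theta = y_next, theta_next
    return x

if __name__ == "__main__":
    # Hypothetical test: f(x) = 0.5 * x^T A x with A = diag(1, 10), so L = 10.
    A = np.diag([1.0, 10.0])
    grad = lambda x: A @ x
    x0 = np.array([5.0, 5.0])
    for name, method in [("FGM", fgm), ("OGM", ogm)]:
        xN = method(grad, x0, L=10.0, N=50)
        print(name, "gradient norm:", np.linalg.norm(grad(xN)))
```

The sketch only illustrates the class of fixed-step first-order methods the abstract refers to; the paper's OGM-OG chooses its step coefficients to optimize the worst-case gradient-norm bound rather than the function-value bound.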