Nesterov Acceleration for Riemannian Optimization

Published 4 Feb 2022 in math.OC (arXiv:2202.02036v1)

Abstract: In this paper, we generalize the Nesterov accelerated gradient (NAG) method to solve Riemannian optimization problems in a computationally tractable manner. The iteration complexity of our algorithm matches that of the NAG method on the Euclidean space when the objective functions are geodesically convex or geodesically strongly convex. To the best of our knowledge, the proposed algorithm is the first fully accelerated method for geodesically convex optimization problems without requiring strong convexity. Our convergence rate analysis exploits novel metric distortion lemmas as well as carefully designed potential functions. We also identify a connection with the continuous-time dynamics for modeling Riemannian acceleration of Alimisis et al. [1], which explains the accelerated convergence of our scheme through the lens of continuous-time flows.
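
The paper's precise algorithm and parameter schedules appear in the full text. As an illustration of the kind of computationally tractable scheme the abstract describes, the following is a minimal sketch of a Nesterov-style three-sequence iteration on the unit sphere, written only in terms of the exponential and logarithm maps. The Fréchet-mean objective, the coupling weight `tau`, and the step sizes are illustrative assumptions rather than the paper's construction; in the flat (Euclidean) limit these updates reduce to standard NAG.

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x along tangent v."""
    t = np.linalg.norm(v)
    return x if t < 1e-12 else np.cos(t) * x + np.sin(t) * v / t

def sphere_log(x, y):
    """Logarithm map on the unit sphere: the tangent vector at x pointing toward y."""
    c = np.clip(x @ y, -1.0, 1.0)
    u = y - c * x
    nu = np.linalg.norm(u)
    return np.zeros_like(x) if nu < 1e-12 else np.arccos(c) * u / nu

def frechet_grad(x, pts):
    """Riemannian gradient of f(x) = (1/2m) * sum_i d(x, p_i)^2 (Frechet-mean objective)."""
    return -np.mean([sphere_log(x, p) for p in pts], axis=0)

rng = np.random.default_rng(0)
n, m = 5, 20
base = rng.standard_normal(n)
base /= np.linalg.norm(base)
pts = []
for _ in range(m):
    w = rng.standard_normal(n)
    w -= (base @ w) * base                     # project onto the tangent space at base
    pts.append(sphere_exp(base, 0.3 * w / np.linalg.norm(w)))  # small geodesic ball,
                                               # where f is geodesically strongly convex

x = pts[0].copy()   # main iterate
z = pts[0].copy()   # auxiliary "momentum" point kept on the manifold
step = 1.0          # illustrative 1/L step size; L is about 1 for points in a small ball

for k in range(100):
    tau = 2.0 / (k + 3)                        # illustrative NAG coupling weight
    y = sphere_exp(x, tau * sphere_log(x, z))  # geodesic blend of x and z
    g = frechet_grad(y, pts)
    x = sphere_exp(y, -step * g)               # short gradient step from y
    # long "momentum" step: pull z into the tangent space at y, step, push back
    z = sphere_exp(y, sphere_log(y, z) - 0.5 * (k + 2) * step * g)

f_val = 0.5 * np.mean([np.arccos(np.clip(x @ p, -1.0, 1.0)) ** 2 for p in pts])
print("final objective:", f_val)
```

Pulling z into the tangent space at y before the long gradient step sidesteps an explicit parallel transport, one common way such schemes stay tractable; whether this matches the paper's construction should be checked against the full text.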
