Nesterov Acceleration for Riemannian Optimization (2202.02036v1)

Published 4 Feb 2022 in math.OC

Abstract: In this paper, we generalize the Nesterov accelerated gradient (NAG) method to solve Riemannian optimization problems in a computationally tractable manner. The iteration complexity of our algorithm matches that of the NAG method on the Euclidean space when the objective functions are geodesically convex or geodesically strongly convex. To the best of our knowledge, the proposed algorithm is the first fully accelerated method for geodesically convex optimization problems without requiring strong convexity. Our convergence rate analysis exploits novel metric distortion lemmas as well as carefully designed potential functions. We also identify a connection with the continuous-time dynamics for modeling Riemannian acceleration in Alimisis et al. [1], which allows us to interpret the accelerated convergence of our scheme through the lens of continuous-time flows.
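
The abstract's main ingredients, geodesic steps via the exponential map, a tangent-space logarithm to compare points on the manifold, and a Nesterov-style momentum sequence, can be made concrete with a small example. Below is a minimal sketch of a generic Riemannian NAG-style iteration on the unit sphere, minimizing f(x) = -x^T A x / 2 (whose minimizer is the top eigenvector of A). This is not the paper's exact algorithm: the momentum coefficient, the extrapolation rule, and the step size are illustrative assumptions, and the toy loop omits the metric distortion corrections that the paper's analysis relies on.

import numpy as np

def exp_map(x, v):
    # Exponential map on the sphere S^{n-1}: follow the geodesic from x
    # in tangent direction v for length ||v||.
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def log_map(x, y):
    # Logarithm map: the tangent vector at x whose geodesic reaches y.
    u = y - np.dot(x, y) * x              # component of y tangent at x
    nu = np.linalg.norm(u)
    if nu < 1e-12:
        return np.zeros_like(x)           # y = x (or antipodal; ill-defined)
    theta = np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))
    return theta * (u / nu)

def rgrad(A, x):
    # Riemannian gradient of f(x) = -x^T A x / 2: project the Euclidean
    # gradient -A x onto the tangent space at x.
    g = -A @ x
    return g - np.dot(x, g) * x

def riemannian_nag(A, x0, step, beta=0.9, iters=500):
    # NAG-style loop (illustrative): take a gradient step from the
    # lookahead point y, then extrapolate along the geodesic through the
    # last two iterates -- the Riemannian analogue of the Euclidean
    # update y = x_k + beta * (x_k - x_{k-1}).
    x_prev, y = x0, x0
    for _ in range(iters):
        x = exp_map(y, -step * rgrad(A, y))
        y = exp_map(x, -beta * log_map(x, x_prev))
        x_prev = x
    return x

rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M + M.T                               # symmetric test matrix
x0 = rng.standard_normal(n)
x0 /= np.linalg.norm(x0)
step = 1.0 / (2.0 * np.linalg.norm(A, 2)) # conservative step from smoothness
x_star = riemannian_nag(A, x0, step)
print("Rayleigh quotient:", x_star @ A @ x_star)
print("Top eigenvalue:   ", np.linalg.eigvalsh(A)[-1])

A fixed momentum coefficient like the beta above mirrors the strongly convex regime of Euclidean NAG; the paper's contribution is to design the coupling between the iterate and momentum sequences so that the Euclidean accelerated rates carry over to geodesically (strongly) convex objectives despite the curvature-induced distortion quantified by its metric distortion lemmas.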
