Sufficient conditions for non-asymptotic convergence of Riemannian optimisation methods (2212.05972v1)
Abstract: Motivated by energy-based analyses for descent methods in the Euclidean setting, we investigate a generalisation of such analyses to descent methods over Riemannian manifolds. In doing so, we find that it is possible to derive curvature-free guarantees for such descent methods. This also enables us to give the first known guarantees for a Riemannian cubic-regularised Newton algorithm over $g$-convex functions, which extend the guarantees by Agarwal et al. [2021] for an adaptive Riemannian cubic-regularised Newton algorithm over general non-convex functions. This analysis leads us to study acceleration of Riemannian gradient descent in the $g$-convex setting, and we improve on an existing result by Alimisis et al. [2021], albeit with a curvature-dependent rate. Finally, extending the analysis of Ahn and Sra [2020], we provide sufficient conditions for the acceleration of Riemannian descent methods in the strongly geodesically convex setting.
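For orientation (this sketch is not from the paper), the Riemannian gradient descent scheme the abstract refers to takes steps along geodesics, $x_{k+1} = \mathrm{Exp}_{x_k}(-\eta\, \mathrm{grad} f(x_k))$. Below is a minimal illustration on the unit sphere with its embedded metric, where the Riemannian gradient is the tangent-space projection of the Euclidean gradient and the exponential map has a closed form; the eigenvector objective and step size are illustrative choices, not the paper's experiments.

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: geodesic from x in tangent direction v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def riemannian_grad(x, egrad):
    """Riemannian gradient on the sphere: project the Euclidean gradient onto T_x."""
    return egrad - np.dot(x, egrad) * x

def riemannian_gradient_descent(euclidean_grad, x0, step=0.1, iters=200):
    """Iterate x_{k+1} = Exp_{x_k}(-step * grad f(x_k))."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        g = riemannian_grad(x, euclidean_grad(x))
        x = sphere_exp(x, -step * g)
    return x

# Illustrative use: minimising f(x) = -x^T A x over the sphere (Euclidean
# gradient -2Ax) recovers a leading eigenvector of the symmetric matrix A.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5)); A = (A + A.T) / 2
x_star = riemannian_gradient_descent(lambda x: -2 * A @ x, rng.standard_normal(5))
```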