Gradient Method for Optimization on Riemannian Manifolds with Lower Bounded Curvature (1806.02694v1)
Abstract: This paper analyzes the gradient method for minimizing a differentiable convex function on Riemannian manifolds with sectional curvature bounded from below. The method is analyzed with three different finite procedures for determining the stepsize: the Lipschitz stepsize, the adaptive stepsize, and the Armijo stepsize. The first procedure requires the objective function to have a Lipschitz continuous gradient, an assumption that the other two approaches do not need. Convergence of the whole sequence of iterates to a minimizer is proved without any boundedness assumption on the level sets. An iteration-complexity bound is also established for functions with Lipschitz continuous gradient. Numerical experiments illustrate the effectiveness of the method in this new setting and corroborate the theoretical results. In particular, we consider the problem of finding the Riemannian center of mass and the so-called Karcher mean. Our numerical experiments indicate that the adaptive stepsize is a promising scheme that is worth considering.
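To make the setting concrete, below is a minimal sketch, not the authors' implementation, of the Riemannian gradient method with Armijo backtracking applied to the Karcher mean of points on the unit sphere (a manifold whose curvature is bounded below, as the abstract assumes). The function names `sphere_exp`, `sphere_log`, and `karcher_mean`, and the parameters `sigma` and `beta`, are illustrative choices, not names from the paper.

```python
import numpy as np

def sphere_exp(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x in direction v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def sphere_log(x, y):
    """Logarithm map: tangent vector at x pointing toward y along the geodesic."""
    c = np.clip(x @ y, -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-12:
        return np.zeros_like(x)
    u = y - c * x
    return theta * u / np.linalg.norm(u)

def karcher_mean(points, sigma=1e-4, beta=0.5, tol=1e-10, max_iter=200):
    """Riemannian gradient descent with Armijo backtracking for the
    center-of-mass objective f(x) = (1/2m) * sum_i d(x, p_i)^2."""
    def cost(x):
        return 0.5 * np.mean([np.arccos(np.clip(x @ p, -1.0, 1.0)) ** 2
                              for p in points])
    x = points[0]
    for _ in range(max_iter):
        # Riemannian gradient of f at x: minus the mean of the log maps.
        g = -np.mean([sphere_log(x, p) for p in points], axis=0)
        gn2 = g @ g
        if gn2 < tol:
            break
        # Armijo rule: backtrack along the geodesic until sufficient decrease.
        t = 1.0
        while cost(sphere_exp(x, -t * g)) > cost(x) - sigma * t * gn2:
            t *= beta
        x = sphere_exp(x, -t * g)
    return x

# Example: average a cluster of points near the north pole.
rng = np.random.default_rng(0)
base = np.array([0.0, 0.0, 1.0])
pts = [(base + 0.3 * rng.standard_normal(3)) for _ in range(5)]
pts = [p / np.linalg.norm(p) for p in pts]
print(karcher_mean(pts))
```

The Armijo rule here stands in for one of the three stepsize procedures the paper studies; unlike the Lipschitz stepsize, it needs no knowledge of a gradient Lipschitz constant, only function evaluations along the geodesic.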