On Relatively Smooth Optimization over Riemannian Manifolds (2508.03048v1)
Abstract: We study optimization over Riemannian embedded submanifolds, where the objective function is relatively smooth in the ambient Euclidean space. Such problems have broad applications but are still largely unexplored. We introduce two Riemannian first-order methods, namely the retraction-based and projection-based Riemannian Bregman gradient methods, by incorporating the Bregman distance into the update steps. The retraction-based method can handle nonsmooth optimization; at each iteration, the update direction is generated by solving a convex optimization subproblem constrained to the tangent space. We show that when the reference function is of the quartic form $h(x) = \frac{1}{4}\|x\|^4 + \frac{1}{2}\|x\|^2$, the constrained subproblem admits a closed-form solution. The projection-based approach applies to smooth Riemannian optimization and solves an unconstrained subproblem in the ambient Euclidean space. Both methods are shown to achieve an iteration complexity of $\mathcal{O}(1/\epsilon^2)$ for finding an $\epsilon$-approximate Riemannian stationary point. When the manifold is compact, we further develop stochastic variants and establish a sample complexity of $\mathcal{O}(1/\epsilon^4)$. Numerical experiments on the nonlinear eigenvalue problem and the low-rank quadratic sensing problem demonstrate the advantages of the proposed methods.
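To make the update concrete, here is a minimal sketch of one projection-based Bregman gradient step on the unit sphere, under the quartic reference function $h(x) = \frac{1}{4}\|x\|^4 + \frac{1}{2}\|x\|^2$ named in the abstract. With $D_h(y, x) = h(y) - h(x) - \langle \nabla h(x), y - x \rangle$ and $\nabla h(x) = (\|x\|^2 + 1)x$, the ambient subproblem $\min_y \langle \nabla f(x), y - x \rangle + L\, D_h(y, x)$ reduces to inverting $\nabla h$, which comes down to a scalar cubic with a unique real root; this mirrors the closed-form solvability the paper claims for the quartic reference. The choice of the sphere as the manifold, the step-size constant `L`, the toy objective, and all function names below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def grad_h_inverse(c):
    """Invert grad h for h(x) = 1/4 ||x||^4 + 1/2 ||x||^2, i.e. solve
    (||y||^2 + 1) y = c.  Writing y = t * c / ||c|| reduces this to the
    scalar cubic t^3 + t = ||c||, solved here by Cardano's formula."""
    r = np.linalg.norm(c)
    if r == 0.0:
        return np.zeros_like(c)
    d = np.sqrt(r**2 / 4.0 + 1.0 / 27.0)
    t = np.cbrt(r / 2.0 + d) + np.cbrt(r / 2.0 - d)  # unique real root
    return (t / r) * c

def projection_bregman_step(x, grad_f, L):
    """One hypothetical projection-based Bregman gradient step on the
    unit sphere: minimize <grad_f(x), y - x> + L * D_h(y, x) over the
    ambient space (optimality: grad h(y) = grad h(x) - grad_f(x) / L),
    then project the ambient minimizer back onto the sphere."""
    grad_h_x = (np.dot(x, x) + 1.0) * x              # grad h(x) = (||x||^2 + 1) x
    y = grad_h_inverse(grad_h_x - grad_f(x) / L)     # ambient minimizer
    return y / np.linalg.norm(y)                     # closest point on S^{n-1}

# Toy instance in the spirit of a nonlinear eigenvalue problem (assumed):
# f(x) = 0.5 * x^T A x + 0.25 * alpha * sum(x_i^4) on the unit sphere.
rng = np.random.default_rng(0)
n, alpha, L = 50, 1.0, 10.0
A = rng.standard_normal((n, n))
A = (A + A.T) / 2.0                                  # symmetrize
grad_f = lambda x: A @ x + alpha * x**3
x = rng.standard_normal(n)
x /= np.linalg.norm(x)
for _ in range(500):
    x = projection_bregman_step(x, grad_f, L)
```

Because $t \mapsto t^3 + t$ is strictly increasing, the cubic always has exactly one real root, so each Bregman step in this sketch costs no more than a Euclidean gradient evaluation plus a normalization.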