A Riemannian conjugate subgradient method for nonconvex and nonsmooth optimization on manifolds

Published 7 Sep 2025 in math.OC (arXiv:2509.05947v1)

Abstract: Conjugate gradient (CG) methods are widely acknowledged as efficient for minimizing continuously differentiable functions in Euclidean spaces. In recent years, various CG methods have been extended to Riemannian manifold optimization, but existing Riemannian CG methods are confined to smooth objective functions and cannot handle nonsmooth ones. This paper proposes a Riemannian conjugate subgradient method for a class of nonconvex, nonsmooth optimization problems on manifolds. Specifically, we first select a Riemannian subgradient from the convex hull of two directionally active subgradients. The search direction is then defined as a convex combination of the negative of this subgradient and the previous search direction transported to the current tangent space. Additionally, a Riemannian line search with an interval reduction procedure is integrated to generate an appropriate step size, ensuring the objective function values form a monotonically nonincreasing sequence. We establish the global convergence of the algorithm under mild assumptions. Numerical experiments on three classes of Riemannian optimization problems show that the proposed method takes significantly less computational time than related existing methods. To our knowledge, this is the first CG-type method developed for Riemannian nonsmooth optimization.
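The direction update described in the abstract, a convex combination of the negative Riemannian subgradient and the transported previous direction, can be sketched in a simplified form. The following is a minimal illustration, not the paper's algorithm: it works on the unit sphere with the nonsmooth objective f(x) = ||x||_1, uses tangent-space projection as the vector transport and normalization as the retraction, and replaces the paper's interval-reduction line search with plain backtracking that still enforces a monotonically nonincreasing sequence of objective values. The fixed mixing weight `beta` and the subgradient choice `sign(x)` are illustrative assumptions.

```python
import numpy as np

def proj_tangent(x, v):
    # Project v onto the tangent space of the unit sphere at x.
    return v - np.dot(x, v) * x

def retract(x, v):
    # Retraction on the sphere: take the step, then renormalize.
    y = x + v
    return y / np.linalg.norm(y)

def f(x):
    # Nonsmooth objective: the l1 norm restricted to the sphere.
    return np.sum(np.abs(x))

def subgrad(x):
    # A Euclidean subgradient of the l1 norm (illustrative choice at kinks).
    return np.sign(x)

def conjugate_subgradient_sketch(x, iters=200, beta=0.5):
    # Initial direction: negative Riemannian subgradient.
    d = -proj_tangent(x, subgrad(x))
    for _ in range(iters):
        # Backtracking stand-in for the paper's interval-reduction line search.
        t, fx = 1.0, f(x)
        while f(retract(x, t * d)) > fx and t > 1e-12:
            t *= 0.5
        x_new = retract(x, t * d)
        if f(x_new) > fx:
            x_new = x  # keep the iterate so f stays nonincreasing
        g_new = proj_tangent(x_new, subgrad(x_new))
        # Transport the previous direction by projecting it onto the new
        # tangent space, then mix with the negative subgradient.
        d = -beta * g_new + (1 - beta) * proj_tangent(x_new, d)
        x = x_new
    return x
```

Running the sketch from a random point on the sphere keeps the iterate feasible and never increases the objective, mirroring the monotonicity guarantee stated in the abstract.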
