
A Single-loop Stochastic Riemannian ADMM for Nonsmooth Optimization

Published 28 Dec 2025 in math.OC (arXiv:2512.22750v1)

Abstract: We study a class of nonsmooth stochastic optimization problems on Riemannian manifolds. In this work, we propose MARS-ADMM, the first stochastic Riemannian alternating direction method of multipliers with provable near-optimal complexity guarantees. Our algorithm incorporates a momentum-based variance-reduced gradient estimator applied exclusively to the smooth component of the objective, together with carefully designed penalty parameter and dual stepsize updates. Unlike existing approaches that rely on computationally expensive double-loop frameworks, MARS-ADMM operates in a single-loop fashion and requires only a constant number of stochastic gradient evaluations per iteration. Under mild assumptions, we establish that MARS-ADMM achieves an iteration complexity of $\tilde{\mathcal{O}}(\varepsilon^{-3})$, which improves upon the previously best-known bound of $\mathcal{O}(\varepsilon^{-3.5})$ for stochastic Riemannian operator-splitting methods. As a result, our analysis closes the theoretical complexity gap between stochastic Riemannian operator-splitting algorithms and stochastic methods for nonsmooth optimization with nonlinear constraints. Notably, the obtained complexity also matches the best-known bounds in deterministic nonsmooth Riemannian optimization, demonstrating that deterministic-level accuracy can be achieved using only constant-size stochastic samples.
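To make the abstract's key ingredient concrete, the following is a minimal, hypothetical sketch of a momentum-based variance-reduced (STORM-style) Riemannian gradient estimator on the unit sphere, i.e. the kind of estimator the abstract says is applied to the smooth component. It is not the paper's MARS-ADMM: the least-squares objective, the projection-based vector transport, the step sizes, and all function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical sketch: STORM-style momentum variance reduction for a smooth
# objective restricted to the unit sphere. Problem data, step sizes, and
# helper names are assumptions for illustration only.

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def proj_tangent(x, v):
    """Project v onto the tangent space of the unit sphere at x."""
    return v - (x @ v) * x

def retract(x, v):
    """Retraction on the sphere: step in the tangent direction, then renormalize."""
    y = x + v
    return y / np.linalg.norm(y)

def stoch_rgrad(x, idx):
    """Mini-batch Riemannian gradient of f(x) = (1/2n) * ||A x - b||^2 on the sphere."""
    g = A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)
    return proj_tangent(x, g)

x = rng.standard_normal(d)
x /= np.linalg.norm(x)
x_prev, d_est = x.copy(), None
eta, beta, batch = 0.05, 0.3, 8  # constant-size sample per iteration (single loop)

for t in range(300):
    idx = rng.integers(0, n, size=batch)
    g_new = stoch_rgrad(x, idx)
    if d_est is None:
        d_est = g_new
    else:
        g_old = stoch_rgrad(x_prev, idx)        # same sample at the previous iterate
        # carry the previous estimator to the current tangent space by projection
        carried = proj_tangent(x, d_est - g_old)
        d_est = g_new + (1.0 - beta) * carried  # STORM-style momentum correction
    x_prev = x.copy()
    x = retract(x, -eta * d_est)

print("final Riemannian gradient norm:", np.linalg.norm(stoch_rgrad(x, np.arange(n))))
```

The point of the sketch is the update `d_t = g(x_t) + (1 - beta) * T(d_{t-1} - g(x_{t-1}))`, which reuses one constant-size sample at two consecutive iterates and transports the correction between tangent spaces; how such an estimator is combined with the ADMM splitting, penalty, and dual stepsize updates is specific to the paper.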
