Riemannian Adaptive Regularized Newton Methods with Hölder Continuous Hessians (2309.04052v3)

Published 8 Sep 2023 in math.OC

Abstract: This paper presents strong worst-case iteration and operation complexity guarantees for Riemannian adaptive regularized Newton methods, a unified framework encompassing both Riemannian adaptive regularization (RAR) methods and Riemannian trust region (RTR) methods. We comprehensively characterize the sources of approximation in second-order manifold optimization methods: the objective function's smoothness, the retraction's smoothness, and the subproblem solver's inexactness. Specifically, for a function with a $\mu$-H\"older continuous Hessian, when equipped with a retraction featuring a $\nu$-H\"older continuous differential and a $\theta$-inexact subproblem solver, both RTR and RAR with $2+\alpha$ regularization (where $\alpha=\min\{\mu,\nu,\theta\}$) locate an $(\epsilon,\epsilon^{\alpha/(1+\alpha)})$-approximate second-order stationary point within at most $O(\epsilon^{-(2+\alpha)/(1+\alpha)})$ iterations and at most $\tilde{O}(\epsilon^{-(4+3\alpha)/(2(1+\alpha))})$ Hessian-vector products. These complexity results are novel and sharp, and they reduce to an iteration complexity of $O(\epsilon^{-3/2})$ and an operation complexity of $\tilde{O}(\epsilon^{-7/4})$ when $\alpha=1$.
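As a quick check of the stated bounds, setting $\alpha=1$ (a Lipschitz Hessian, a retraction with Lipschitz differential, and a sufficiently exact subproblem solver) recovers the familiar exponents:

$$\frac{2+\alpha}{1+\alpha}\bigg|_{\alpha=1} = \frac{3}{2}, \qquad \frac{4+3\alpha}{2(1+\alpha)}\bigg|_{\alpha=1} = \frac{4+3}{4} = \frac{7}{4}.$$

To make the $2+\alpha$ regularization concrete, here is a minimal sketch of one such Newton step on the unit sphere $S^{n-1}$ with the metric-projection retraction. The names (`rar_step`, `sigma`), the fixed regularization parameter, and the generic L-BFGS-B inner solve are assumptions for this sketch, not the paper's algorithm, which adaptively updates the regularization and measures operation complexity in Hessian-vector products of a $\theta$-inexact subproblem solver.

```python
# Illustrative sketch only: a (2 + alpha)-regularized Newton step on the unit
# sphere with the metric-projection retraction. Names, the fixed `sigma`, and
# the L-BFGS-B inner solve are assumptions, not the paper's algorithm.
import numpy as np
from scipy.optimize import minimize


def tangent_project(x, v):
    """Orthogonal projection onto T_x S^{n-1} = {s : <x, s> = 0}."""
    return v - (x @ v) * x


def retract(x, s):
    """Metric-projection retraction on the sphere: normalize x + s."""
    y = x + s
    return y / np.linalg.norm(y)


def rar_step(x, grad_f, hess_f, sigma, alpha=1.0):
    """Approximately minimize the (2 + alpha)-regularized model
        m(s) = <g, s> + 0.5 <s, Hess f(x)[s]> + sigma/(2+alpha) * ||s||^(2+alpha)
    over the tangent space at x, then retract."""
    eg = grad_f(x)                 # Euclidean gradient
    g = tangent_project(x, eg)     # Riemannian gradient
    H = hess_f(x)                  # Euclidean Hessian matrix

    def hess_vec(s):
        # Riemannian Hessian-vector product on the sphere.
        return tangent_project(x, H @ s) - (x @ eg) * s

    def model(s):
        s = tangent_project(x, s)  # restrict the model to the tangent space
        ns = np.linalg.norm(s)
        return (g @ s + 0.5 * (s @ hess_vec(s))
                + sigma / (2 + alpha) * ns ** (2 + alpha))

    # A theta-inexact subproblem solve, here by a generic first-order method.
    res = minimize(model, -g, method="L-BFGS-B")
    return retract(x, tangent_project(x, res.x))


# Usage on a standard test problem: minimizing the Rayleigh quotient
# f(x) = 0.5 x^T A x on the sphere, which drives x toward an eigenvector
# for the smallest eigenvalue of A.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
A = (A + A.T) / 2
x = rng.standard_normal(6)
x /= np.linalg.norm(x)
for _ in range(30):
    x = rar_step(x, lambda v: A @ v, lambda v: A, sigma=1.0)
```

Here $\alpha$ plays the role of $\min\{\mu,\nu,\theta\}$ from the abstract: the sketch uses $\alpha=1$, the Lipschitz case, which corresponds to the classical cubic regularization exponent $2+\alpha=3$.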
