
A restricted memory quasi-Newton bundle method for nonsmooth optimization on Riemannian manifolds (2402.18308v1)

Published 28 Feb 2024 in math.OC

Abstract: In this paper, a restricted memory quasi-Newton bundle method for minimizing a locally Lipschitz function over a Riemannian manifold is proposed, extending the classical method from Euclidean spaces to the manifold setting. Curvature information of the objective function is approximated by Riemannian versions of the quasi-Newton updating formulas. A subgradient aggregation technique avoids solving a time-consuming quadratic programming subproblem when computing the candidate descent direction. Moreover, a new Riemannian line search procedure generates the stepsizes and terminates finitely under a new version of the Riemannian semismoothness assumption. Global convergence of the proposed method is established: if only finitely many serious steps occur, then the last serious iterate is stationary; otherwise, every accumulation point of the serious iteration sequence is stationary. Finally, preliminary numerical results indicate that the proposed method is efficient.
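The abstract describes the method's main ingredients: Riemannian subgradients, aggregation in place of a quadratic programming subproblem, a line search, and serious/null steps. The following is a minimal structural sketch of such a loop, not the authors' algorithm: it fixes the manifold to the unit sphere with a projection retraction, replaces the restricted memory quasi-Newton scaling with an identity scaling, omits vector transport during aggregation, and minimizes the illustrative nonsmooth function f(x) = max_i |x_i|; all names and parameters are hypothetical.

```python
# Minimal sketch of a Riemannian subgradient-aggregation descent loop on the unit
# sphere. NOT the paper's algorithm: quasi-Newton scaling and vector transport are
# omitted, and the test function is illustrative.
import numpy as np

def f(x):
    # nonsmooth objective: infinity norm of x
    return np.max(np.abs(x))

def euclidean_subgrad(x):
    # one valid Euclidean subgradient: sign(x_i) e_i for an index attaining the max
    g = np.zeros_like(x)
    i = int(np.argmax(np.abs(x)))
    g[i] = np.sign(x[i])
    return g

def proj_tangent(x, v):
    # orthogonal projection onto the tangent space of the sphere at x (||x|| = 1)
    return v - np.dot(x, v) * x

def retract(x, v):
    # metric projection retraction on the sphere
    y = x + v
    return y / np.linalg.norm(y)

def sketch_bundle_descent(x, max_iter=200, m=0.1, tol=1e-8):
    agg = proj_tangent(x, euclidean_subgrad(x))      # aggregated Riemannian subgradient
    for _ in range(max_iter):
        d = -agg                                     # candidate descent direction
        if np.linalg.norm(d) < tol:
            break
        # backtracking line search for a sufficient-decrease (serious) step
        t = 1.0
        while t > 1e-12 and f(retract(x, t * d)) > f(x) - m * t * np.dot(agg, agg):
            t *= 0.5
        y = retract(x, t * d)
        if f(y) <= f(x) - m * t * np.dot(agg, agg):  # serious step: move to y
            x = y
            agg = proj_tangent(x, euclidean_subgrad(x))
        else:                                        # null step: aggregate subgradients
            # (vector transport omitted; re-projecting at x is a crude stand-in)
            agg = 0.5 * agg + 0.5 * proj_tangent(x, euclidean_subgrad(y))
    return x

x0 = np.array([3.0, 1.0, 1.0, 1.0, 1.0])
x0 /= np.linalg.norm(x0)                             # f(x0) = 3/sqrt(13) ~ 0.83
x_final = sketch_bundle_descent(x0)
print(f(x_final))                                    # decreases towards 1/sqrt(5) ~ 0.45
```

In the paper's method, the identity scaling above would be replaced by a restricted memory quasi-Newton approximation of the curvature, the aggregated subgradient would be transported between tangent spaces, and the line search would rely on the Riemannian semismoothness assumption for finite termination.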
