Convergence of inexact descent methods for nonconvex optimization on Riemannian manifolds (1103.4828v1)

Published 24 Mar 2011 in math.NA

Abstract: In this paper we present an abstract convergence analysis of inexact descent methods in the Riemannian context for functions satisfying the Kurdyka-Łojasiewicz inequality. In particular, without any restrictive assumption on the sign of the sectional curvature of the manifold, we obtain full convergence of a bounded sequence generated by the proximal point method in the case where the objective function is nonsmooth and nonconvex, and the subproblems are determined by a quasi-distance that does not necessarily coincide with the Riemannian distance. Moreover, if the objective function is $C^1$ with an $L$-Lipschitz gradient, not necessarily convex, but satisfying the Kurdyka-Łojasiewicz inequality, full convergence of a bounded sequence generated by the steepest descent method is obtained.
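For orientation, here is a minimal sketch, in generic notation, of the two iteration schemes and the inequality the abstract refers to. The symbols below ($M$, $q$, $\lambda_k$, $t_k$, $\varphi$, $\eta$) are standard placeholders rather than the paper's own notation, and the precise step-size and subproblem conditions used in the paper may differ.

Proximal point subproblem (nonsmooth, nonconvex $f$ on a Riemannian manifold $M$, quasi-distance $q$, parameters $\lambda_k > 0$):
$$x^{k+1} \in \operatorname*{argmin}_{y \in M} \Big\{ f(y) + \frac{\lambda_k}{2}\, q(x^k, y)^2 \Big\}.$$

Riemannian steepest descent ($f$ of class $C^1$ with $L$-Lipschitz gradient, exponential map $\exp$, suitable step sizes $t_k > 0$):
$$x^{k+1} = \exp_{x^k}\!\big( -t_k \operatorname{grad} f(x^k) \big).$$

Kurdyka-Łojasiewicz inequality at a critical point $\bar{x}$ (smooth case, with a desingularizing function $\varphi$ and some $\eta > 0$):
$$\varphi'\big(f(x) - f(\bar{x})\big)\, \big\| \operatorname{grad} f(x) \big\| \ge 1 \qquad \text{for } x \text{ near } \bar{x} \text{ with } f(\bar{x}) < f(x) < f(\bar{x}) + \eta.$$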
