Sufficient Descent Riemannian Conjugate Gradient Method (2009.01451v2)

Published 3 Sep 2020 in math.OC

Abstract: This paper considers sufficient descent Riemannian conjugate gradient methods with line search algorithms. We propose two kinds of sufficient descent nonlinear conjugate gradient methods and prove that these methods satisfy the sufficient descent condition even on Riemannian manifolds. One is a hybrid method combining the Fletcher-Reeves-type method with the Polak-Ribiere-Polyak-type method, and the other is the Hager-Zhang-type method, both of which are generalizations of those used in Euclidean space. We also generalize two kinds of line search algorithms that are widely used in Euclidean space. In addition, we numerically compare our generalized methods by solving several Riemannian optimization problems. The results show that the performance of the proposed hybrid method greatly depends on the type of line search used, whereas the Hager-Zhang-type method converges fast regardless of the type of line search used.
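For orientation, the sketch below (not the authors' implementation) shows a Riemannian conjugate gradient iteration of the hybrid type on the unit sphere. It uses the Euclidean form of the hybrid beta, max(0, min(beta_PRP, beta_FR)), with orthogonal projection serving as both retraction and vector transport, and a fixed step size in place of the line searches analyzed in the paper. All function and parameter names here are illustrative assumptions.

# Minimal sketch of a hybrid FR/PRP Riemannian conjugate gradient
# method on the unit sphere. Illustrative only: the paper's methods
# use Wolfe-type line searches, not the fixed step size used here.
import numpy as np

def project(x, v):
    # Project v onto the tangent space of the unit sphere at x.
    return v - np.dot(x, v) * x

def retract(x, v):
    # Retraction: move along the tangent vector, then renormalize.
    y = x + v
    return y / np.linalg.norm(y)

def rcg_sphere(grad_f, x, max_iter=500, alpha=1e-2, tol=1e-8):
    # grad_f returns the Euclidean gradient; projecting it gives the
    # Riemannian gradient on the sphere with the induced metric.
    g = project(x, grad_f(x))
    if np.linalg.norm(g) < tol:
        return x
    d = -g
    for _ in range(max_iter):
        x_new = retract(x, alpha * d)
        g_new = project(x_new, grad_f(x_new))
        if np.linalg.norm(g_new) < tol:
            return x_new
        # Transport the old gradient and direction to the new tangent
        # space (projection used as a simple vector transport).
        g_old = project(x_new, g)
        d_old = project(x_new, d)
        # Hybrid beta: max(0, min(beta_PRP, beta_FR)), mirroring the
        # Euclidean scheme the paper generalizes.
        y = g_new - g_old
        beta_fr = np.dot(g_new, g_new) / np.dot(g, g)
        beta_prp = np.dot(g_new, y) / np.dot(g, g)
        beta = max(0.0, min(beta_prp, beta_fr))
        d = -g_new + beta * d_old
        x, g = x_new, g_new
    return x

As a usage example, passing grad_f = lambda x: 2 * A @ x for a symmetric matrix A minimizes the Rayleigh quotient x'Ax over the sphere, driving the iterate toward an eigenvector associated with the smallest eigenvalue of A.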
