Descent Properties of an Anderson Accelerated Gradient Method With Restarting (2206.01372v2)

Published 3 Jun 2022 in math.OC

Abstract: Anderson Acceleration (AA) is a popular acceleration technique to enhance the convergence of fixed-point iterations. The analysis of AA approaches typically focuses on the convergence behavior of a corresponding fixed-point residual, while the behavior of the underlying objective function values along the accelerated iterates is currently not well understood. In this paper, we investigate local properties of AA with restarting applied to a basic gradient scheme in terms of function values. Specifically, we show that AA with restarting is a local descent method and that it can decrease the objective function faster than the gradient method. These new results theoretically support the good numerical performance of AA when heuristic descent conditions are used for globalization, and they provide a novel perspective on the convergence analysis of AA that is more amenable to nonconvex optimization problems. Numerical experiments are conducted to illustrate our theoretical findings.
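
To make the scheme in the abstract concrete, the following Python sketch shows one common way to implement Anderson Acceleration with restarting on top of the basic gradient step. It is an illustration under stated assumptions, not the paper's implementation: the fixed-point map is g(x) = x - alpha * grad f(x), the mixing coefficients solve the standard unconstrained least-squares form of the AA subproblem, and the history is cleared every m steps to implement restarting. All names here (anderson_gd_restart, grad_f, alpha, m) are placeholders chosen for this sketch, not notation from the paper.

import numpy as np

def anderson_gd_restart(grad_f, x0, alpha=0.1, m=5, max_iter=200, tol=1e-10):
    # Illustrative sketch, not the paper's code.
    # Fixed-point map of the basic gradient scheme: g(x) = x - alpha * grad_f(x).
    g = lambda x: x - alpha * grad_f(x)
    x = np.asarray(x0, dtype=float)
    X, G = [x], [g(x)]                    # stored iterates and their images under g
    for _ in range(max_iter):
        r = G[-1] - X[-1]                 # current fixed-point residual g(x_k) - x_k
        if np.linalg.norm(r) < tol:
            break
        if len(X) > 1:
            # Residual matrix: column j is r_j = g(x_j) - x_j over the stored history.
            R = np.column_stack([Gj - Xj for Gj, Xj in zip(G, X)])
            dR = np.diff(R, axis=1)                   # residual differences
            dG = np.diff(np.column_stack(G), axis=1)  # differences of g-values
            # Unconstrained least-squares form of the AA mixing problem:
            # minimize ||r_k - dR @ theta|| over theta.
            theta, *_ = np.linalg.lstsq(dR, r, rcond=None)
            x = G[-1] - dG @ theta                    # Anderson-accelerated iterate
        else:
            x = G[-1]                                 # first step: plain gradient step
        X.append(x)
        G.append(g(x))
        if len(X) > m:                    # restart: discard the memory entirely
            X, G = [x], [g(x)]
    return x

As a quick sanity check, running this on a strongly convex quadratic, e.g. grad_f = lambda x: A @ x - b with A symmetric positive definite, typically drives the residual to zero faster than the plain gradient step g alone, which is consistent with the local descent behavior the paper analyzes.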
