
Convergence Analysis of the Alternating Anderson-Picard Method for Nonlinear Fixed-point Problems (2407.10472v1)

Published 15 Jul 2024 in math.NA and cs.NA

Abstract: Anderson Acceleration (AA) has been widely used to solve nonlinear fixed-point problems due to its rapid convergence. This work focuses on a variant of AA in which multiple Picard iterations are performed between each AA step, referred to as the Alternating Anderson-Picard (AAP) method. Despite introducing more 'slow' Picard iterations, this method has been shown to be efficient and even more robust in both linear and nonlinear cases. However, there is a lack of theoretical analysis for AAP in the nonlinear case, which this paper aims to address. We show the equivalence between AAP and a multisecant-GMRES method that uses GMRES to solve a multisecant linear system at each iteration. More interestingly, the incorporation of Picard iterations and AA establishes a deep connection between AAP and the Newton-GMRES method. This connection is evident in the multisecant matrix, the approximate inverse Jacobian, the search direction, and the optimization gain, an essential factor in the convergence analysis of AA. We show that these terms converge to their corresponding terms in the Newton-GMRES method as the residual approaches zero. Building on this, we establish the convergence analysis of AAP. Numerical examples are provided to validate our theoretical findings.
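To make the iteration the abstract describes concrete, the sketch below shows one common way to realize an Alternating Anderson-Picard loop for a fixed-point map g: a few Picard updates followed by one Anderson Acceleration step built from the collected residual history. The function and parameter names (aap, n_picard, window) are illustrative assumptions rather than notation from the paper, and the AA step shown is the standard type-II least-squares update with mixing parameter 1, not necessarily the paper's exact formulation.

import numpy as np

def aap(g, x0, n_picard=3, window=5, tol=1e-10, max_outer=200):
    """Illustrative sketch of an Alternating Anderson-Picard (AAP) loop for
    the fixed-point problem x = g(x): several Picard updates, then one
    Anderson Acceleration (AA) step using the collected residual history.
    Parameter names here are assumptions, not notation from the paper."""
    x = np.asarray(x0, dtype=float)
    X_hist, F_hist = [], []                    # stored iterates and residuals f = g(x) - x
    for _ in range(max_outer):
        # Picard sweep: n_picard plain fixed-point updates
        for _ in range(n_picard):
            f = g(x) - x
            X_hist.append(x.copy())
            F_hist.append(f.copy())
            if np.linalg.norm(f) < tol:
                return x
            x = x + f                          # Picard update: x <- g(x)
        # One Anderson step built from the stored (multisecant) history
        f = g(x) - x
        X_hist.append(x.copy())
        F_hist.append(f.copy())
        X = np.column_stack(X_hist[-(window + 1):])
        F = np.column_stack(F_hist[-(window + 1):])
        dX, dF = np.diff(X, axis=1), np.diff(F, axis=1)   # secant differences
        gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)    # least-squares multisecant solve
        x = x + f - (dX + dF) @ gamma                     # type-II AA update (beta = 1)
    return x

For example, aap(np.cos, np.array([0.5])) converges to the fixed point of cos(x) near 0.739. The window parameter caps how many secant pairs enter the least-squares solve, mirroring finite-memory AA; the least-squares system here is the multisecant system that the paper relates to GMRES.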

Citations (1)
