
A Convergence Study for Reduced Rank Extrapolation on Nonlinear Systems (1807.03199v2)

Published 9 Jul 2018 in math.NA and cs.NA

Abstract: Reduced Rank Extrapolation (RRE) is a polynomial-type method used to accelerate the convergence of sequences of vectors $\{\boldsymbol{x}_m\}$. It is applied successfully in different disciplines of science and engineering in the solution of large and sparse systems of linear and nonlinear equations of very large dimension. If $\boldsymbol{s}$ is the solution to the system of equations $\boldsymbol{x}=\boldsymbol{f}(\boldsymbol{x})$, first, a vector sequence $\{\boldsymbol{x}_m\}$ is generated via the fixed-point iterative scheme $\boldsymbol{x}_{m+1}=\boldsymbol{f}(\boldsymbol{x}_m)$, $m=0,1,\ldots,$ and next, RRE is applied to this sequence to accelerate its convergence. RRE produces approximations $\boldsymbol{s}_{n,k}$ to $\boldsymbol{s}$ that are of the form $\boldsymbol{s}_{n,k}=\sum_{i=0}^{k}\gamma_i\boldsymbol{x}_{n+i}$ for some scalars $\gamma_i$ depending (nonlinearly) on $\boldsymbol{x}_n,\boldsymbol{x}_{n+1},\ldots,\boldsymbol{x}_{n+k+1}$ and satisfying $\sum_{i=0}^{k}\gamma_i=1$. The convergence properties of RRE when applied in conjunction with linear $\boldsymbol{f}(\boldsymbol{x})$ have been analyzed in different publications. In this work, we discuss the convergence of the $\boldsymbol{s}_{n,k}$ obtained from RRE with nonlinear $\boldsymbol{f}(\boldsymbol{x})$ (i) when $n\to\infty$ with fixed $k$, and (ii) in two so-called {\em cycling} modes.
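The construction described in the abstract can be illustrated with a minimal NumPy sketch (not the paper's code): the $\gamma_i$ are obtained by minimizing $\|\sum_{i=0}^{k}\gamma_i(\boldsymbol{x}_{n+i+1}-\boldsymbol{x}_{n+i})\|$ subject to $\sum_{i=0}^{k}\gamma_i=1$, which via Lagrange multipliers reduces to a small linear system with the Gram matrix of the first differences. The function name `rre` and this particular solution route are illustrative assumptions, and a least-squares solve is used to hedge against an ill-conditioned Gram matrix.

```python
import numpy as np

def rre(xs):
    """Reduced Rank Extrapolation sketch from iterates x_n, ..., x_{n+k+1}.

    Minimizes || sum_i gamma_i (x_{n+i+1} - x_{n+i}) || subject to
    sum_i gamma_i = 1, then returns s_{n,k} = sum_i gamma_i x_{n+i}.
    """
    X = np.asarray(xs, dtype=float)   # shape (k+2, d)
    U = np.diff(X, axis=0)            # first differences, shape (k+1, d)
    G = U @ U.T                       # Gram matrix of the differences
    e = np.ones(G.shape[0])
    # Constrained minimization via Lagrange multipliers: gamma is
    # proportional to G^{-1} e; lstsq hedges against (near-)singular G.
    y, *_ = np.linalg.lstsq(G, e, rcond=None)
    gamma = y / (e @ y)               # normalize so the gammas sum to 1
    return gamma @ X[:-1]             # s_{n,k} uses x_n, ..., x_{n+k}
```

For a linear $\boldsymbol{f}(\boldsymbol{x})=A\boldsymbol{x}+\boldsymbol{b}$ with spectral radius of $A$ below one, feeding $k+2$ fixed-point iterates to `rre` typically yields an estimate closer to $\boldsymbol{s}$ than the last iterate generated.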

Citations (3)
