On a Vectorized Version of a Generalized Richardson Extrapolation Process (1605.02630v3)

Published 9 May 2016 in math.NA and cs.NA

Abstract: Let $\{\mathbf{x}_m\}$ be a vector sequence that satisfies $$\mathbf{x}_m \sim \mathbf{s} + \sum^{\infty}_{i=1} \alpha_i\, \mathbf{g}_i(m) \quad \text{as $m\to\infty$},$$ $\mathbf{s}$ being the limit or antilimit of $\{\mathbf{x}_m\}$ and $\{\mathbf{g}_i(m)\}^{\infty}_{i=1}$ being an asymptotic scale as $m\to\infty$, in the sense that $$\lim_{m\to\infty}\frac{\|\mathbf{g}_{i+1}(m)\|}{\|\mathbf{g}_{i}(m)\|}=0,\quad i=1,2,\ldots.$$ The vector sequences $\{\mathbf{g}_i(m)\}^{\infty}_{m=0}$, $i=1,2,\ldots,$ are known, as well as $\{\mathbf{x}_m\}$. In this work, we analyze the convergence and convergence acceleration properties of a vectorized version of the generalized Richardson extrapolation process that is defined via the equations $$\sum^{k}_{i=1}\langle \mathbf{y},\Delta\mathbf{g}_{i}(m)\rangle\,\widetilde{\alpha}_i=\langle \mathbf{y},\Delta\mathbf{x}_m\rangle,\quad n\leq m\leq n+k-1;\qquad \mathbf{s}_{n,k}=\mathbf{x}_n+\sum^{k}_{i=1}\widetilde{\alpha}_i\,\mathbf{g}_{i}(n),$$ $\mathbf{s}_{n,k}$ being the approximation to $\mathbf{s}$. Here $\mathbf{y}$ is some nonzero vector, $\langle\cdot\,,\cdot\rangle$ is an inner product such that $\langle\alpha\mathbf{a},\beta\mathbf{b}\rangle=\bar{\alpha}\beta\langle\mathbf{a},\mathbf{b}\rangle$, and $\Delta\mathbf{x}_m=\mathbf{x}_{m+1}-\mathbf{x}_m$ and $\Delta\mathbf{g}_i(m)=\mathbf{g}_i(m+1)-\mathbf{g}_i(m)$. By imposing a minimal number of reasonable additional conditions on the $\mathbf{g}_i(m)$, we show that the error $\mathbf{s}_{n,k}-\mathbf{s}$ has a full asymptotic expansion as $n\to\infty$. We also show that actual convergence acceleration takes place, and we provide a complete classification of it.
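
The defining equations above amount to a small linear problem: the coefficients $\widetilde{\alpha}_i$ solve a $k\times k$ system built from the projected differences $\langle\mathbf{y},\Delta\mathbf{g}_i(m)\rangle$ and $\langle\mathbf{y},\Delta\mathbf{x}_m\rangle$, and $\mathbf{s}_{n,k}$ is then a corrected copy of $\mathbf{x}_n$. The NumPy sketch below carries out this construction by a direct solve; the function name, array layout, and the choice of the Euclidean inner product are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def vectorized_gre(x, g, y, n, k):
    """One approximation s_{n,k} of the vectorized generalized Richardson
    extrapolation process (illustrative sketch, not the paper's code).

    x : (M, N) array    -- the vectors x_0, ..., x_{M-1}; needs M >= n + k + 1
    g : (k, M, N) array -- the known sequences g_1(m), ..., g_k(m)
    y : (N,) array      -- the fixed nonzero vector used in the inner products
    """
    # <u, v> = conj(u) . v, matching <alpha a, beta b> = conj(alpha) beta <a, b>
    inner = np.vdot

    A = np.empty((k, k), dtype=complex)
    b = np.empty(k, dtype=complex)
    for row, m in enumerate(range(n, n + k)):            # m = n, ..., n + k - 1
        b[row] = inner(y, x[m + 1] - x[m])               # <y, Delta x_m>
        for i in range(k):
            A[row, i] = inner(y, g[i, m + 1] - g[i, m])  # <y, Delta g_i(m)>

    alpha = np.linalg.solve(A, b)                        # the coefficients alpha~_i
    return x[n] + np.tensordot(alpha, g[:, n, :], axes=1)  # s_{n,k}
```

For a well-conditioned $k\times k$ system this direct solve reproduces the defining equations term by term; the paper's subject is the asymptotic expansion of the error $\mathbf{s}_{n,k}-\mathbf{s}$ and the resulting convergence acceleration, not any particular implementation.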
