Stable extrapolation of analytic functions (1605.09601v1)
Abstract: This paper examines the problem of extrapolating an analytic function to $x > 1$ given perturbed samples on an equally spaced grid on $[-1,1]$. Mathematical folklore states that extrapolation is in general hopelessly ill-conditioned, but we show that a more precise statement carries an interesting nuance. For a function $f$ on $[-1,1]$ that is analytic in a Bernstein ellipse with parameter $\rho > 1$, and for a uniform perturbation level $\epsilon$ on the function samples, we construct an asymptotically best extrapolant $e(x)$ as a least-squares polynomial approximant of an explicitly given degree $M^*$. We show that the extrapolant $e(x)$ converges to $f(x)$ pointwise in the interval $I_\rho = [1,(\rho+\rho^{-1})/2)$ as $\epsilon \to 0$, at a rate given by an $x$-dependent fractional power of $\epsilon$. More precisely, for each $x \in I_{\rho}$ we have
\[ |f(x) - e(x)| = \mathcal{O}\left( \epsilon^{-\log r(x) / \log\rho} \right), \qquad\qquad r(x) = \frac{x+\sqrt{x^2-1}}{\rho}, \]
up to log factors, provided that the oversampling condition
\[ M^* \leq \frac{1}{2} \sqrt{N} \]
is satisfied; this condition is known from approximation theory to be necessary. In short, extrapolation enjoys a weak form of stability, up to a fraction of the characteristic smoothness length. The number of function samples, $N+1$, does not bear on the size of the extrapolation error, provided it obeys the oversampling condition. We also show that no other procedure, linear or nonlinear, can construct from $N+1$ equally spaced samples an extrapolant asymptotically more accurate than $e(x)$. The proofs involve original statements on the stability of polynomial approximation in the Chebyshev basis from equally spaced samples, and these are expected to be of independent interest.
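To make the construction concrete, here is a minimal numerical sketch in Python (not the authors' code). It fits a least-squares Chebyshev approximant of degree $M^*$ to noisy equispaced samples of the test function $f(x) = 1/(1.25 - x)$, which is analytic in the Bernstein ellipse with $\rho = 2$, and compares the observed extrapolation error at a point $x \in I_\rho$ with the rate above. The test function, the noise model, and the degree choice $M^* \approx \log(1/\epsilon)/\log\rho$ (the value suggested by balancing the $\rho^{-M}$ truncation error against noise amplification; the paper gives the precise prescription) are assumptions of this sketch.

```python
import numpy as np

rho = 2.0
f = lambda x: 1.0 / (1.25 - x)   # pole at (rho + 1/rho)/2 = 1.25, so rho = 2

N = 10_000                        # N+1 equispaced samples on [-1, 1]
eps = 1e-8                        # uniform perturbation level
rng = np.random.default_rng(0)

grid = np.linspace(-1.0, 1.0, N + 1)
samples = f(grid) + eps * rng.uniform(-1.0, 1.0, N + 1)

# Assumed degree choice: M* ~ log(1/eps)/log(rho), which balances the
# rho^{-M} truncation error against the noise amplified by extrapolation.
M_star = int(np.log(1.0 / eps) / np.log(rho))
assert M_star <= 0.5 * np.sqrt(N), "oversampling condition M* <= sqrt(N)/2"

# Least-squares fit in the Chebyshev basis, then evaluate beyond [-1, 1].
coeffs = np.polynomial.chebyshev.chebfit(grid, samples, M_star)
x = 1.1                           # extrapolation point in I_rho = [1, 1.25)
e_x = np.polynomial.chebyshev.chebval(x, coeffs)

r = (x + np.sqrt(x**2 - 1.0)) / rho
print(f"observed error : {abs(f(x) - e_x):.2e}")
print(f"predicted rate : {eps ** (-np.log(r) / np.log(rho)):.2e}  (up to log factors)")
```

With these parameters, $M^* = 26$ comfortably satisfies the oversampling condition ($\frac{1}{2}\sqrt{N} = 50$), and the observed error should sit within log factors of the predicted rate $\epsilon^{-\log r(x)/\log\rho} \approx 10^{-3}$ at $x = 1.1$.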