
An Optimal Convergence Rate for the Gaussian Regularized Shannon Sampling Series (1711.04909v4)

Published 14 Nov 2017 in math.NA

Abstract: We consider the reconstruction of a bandlimited function from its finite localized sample data. Truncating the classical Shannon sampling series results in an unsatisfactory convergence rate due to the slow decay of the sinc function. To overcome this drawback, a simple and highly effective method, called the Gaussian regularization of the Shannon series, was proposed in the engineering literature and has received remarkable attention. It works by multiplying the sinc function in the Shannon series by a regularized Gaussian function. Recently, it was proved that the upper error bound of this method achieves a convergence rate of order $O(\frac{1}{\sqrt{n}}\exp(-\frac{\pi-\delta}{2}n))$, where $0<\delta<\pi$ is the bandwidth and $n$ is the number of samples. This is by far the best convergence rate among all regularized methods for the Shannon sampling series. The main objective of this article is to present the theoretical justification and numerical verification that this convergence rate is optimal when $0<\delta<\pi/2$, by estimating the lower error bound of the truncated Gaussian regularized Shannon sampling series.
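The method described in the abstract can be sketched numerically: each sinc term of the truncated Shannon series is damped by a Gaussian window, which turns the slow $O(1/\sqrt{n})$ truncation decay into exponential decay. Below is a minimal sketch, assuming integer-spaced samples, a function bandlimited to $[-\delta,\delta]$ with $0<\delta<\pi$, and the commonly used Gaussian width $r=\sqrt{n/(\pi-\delta)}$; the function and parameter names are illustrative, not taken from the paper.

```python
import numpy as np

def gauss_shannon(sample_fn, t, n, delta):
    """Truncated Gaussian-regularized Shannon series (sketch).

    sample_fn : callable giving the sample f(k) at integer k
    t         : point at which to reconstruct f
    n         : truncation parameter (uses the 2n samples nearest t)
    delta     : bandwidth of f, assumed to satisfy 0 < delta < pi
    """
    # Gaussian width r = sqrt(n / (pi - delta)) is a common choice
    # in the literature for this regularization (assumption).
    r = np.sqrt(n / (np.pi - delta))
    k0 = int(np.floor(t))
    ks = np.arange(k0 - n + 1, k0 + n + 1)  # the 2n samples nearest t
    # np.sinc(x) = sin(pi*x)/(pi*x), matching the Shannon series
    # for unit-spaced samples; each term is damped by the Gaussian.
    return sum(sample_fn(k) * np.sinc(t - k) * np.exp(-(t - k) ** 2 / (2 * r ** 2))
               for k in ks)
```

For a test signal such as `np.sinc(0.25 * t)` (bandlimited to $\delta = \pi/4 < \pi/2$), the reconstruction error at moderate `n` is already far below what the plain truncated Shannon series achieves, consistent with the exponential upper bound quoted above.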


Authors (1)
