
Shannon-Type Sampling: Theory & Extensions

Updated 31 December 2025
  • Shannon-type sampling is a mathematical framework for reconstructing bandlimited signals from both uniform and non-uniform samples based on the classical Shannon theorem.
  • Regularized sampling using window functions improves convergence rates and robustness against noise by enabling exponential error decay.
  • Extensions include non-uniform, derivative, and sparse sampling methods adapted to complex domains such as Riemannian manifolds and operator settings.

Shannon-type sampling refers to the family of mathematical frameworks, methods, and theoretical results governing the recovery and interpolation of signals or functions from discrete samples, subject to a priori constraints on their spectral or structural content. The classical Shannon sampling theorem, which enables perfect reconstruction of bandlimited signals from uniform samples, forms the conceptual and technical foundation. Subsequent developments extend this paradigm to regularized sampling, generalizations to higher dimensions, operator and manifold versions, settings with nonlinear or quantized measurements, sparse or random sampling, and cases involving non-decaying or generalized functions. The interplay between sampling, approximation rates, robustness to noise, and the intrinsic geometry of data has led to a spectrum of rigorous results and practical algorithms.

1. The Classical Shannon Sampling Theorem and Core Principles

The canonical result, the Shannon–Whittaker–Kotelnikov sampling theorem, states that a real-valued signal $x(t)$ bandlimited to $[-B,B]$ (i.e., $\hat{x}(f)=0$ for $|f|>B$) is uniquely determined by, and can be reconstructed from, its samples at uniform interval $T\leq 1/(2B)$ by the formula

$$x(t) = \sum_{n\in\mathbb{Z}} x(nT)\,\mathrm{sinc}\left(\frac{t-nT}{T}\right), \qquad \mathrm{sinc}(u) = \frac{\sin(\pi u)}{\pi u},$$

with convergence in $L^2$, and uniformly if $x$ is continuous and square-integrable (Fujikawa et al., 2015, Javanmard et al., 2012, Chen et al., 2013, Kircheis et al., 2024).

Sampling at $f_s = 1/T$ at or above the Nyquist rate $2B$ avoids aliasing. The mathematical reason for this result lies in the Paley–Wiener theorem and properties of entire functions of exponential type, which provide a reconstructive cardinal series for any bandlimited function.
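The cardinal series is easy to check numerically. The sketch below reconstructs the test signal $f(t)=\mathrm{sinc}^2(t)$ (bandlimited to $B=1$) from its uniform samples; the truncation range and evaluation points are illustrative choices, not taken from the cited papers.

```python
import numpy as np

def shannon_reconstruct(samples, n, T, t):
    """Truncated Whittaker cardinal series: sum_n f(nT) sinc((t - nT)/T)."""
    # np.sinc(u) = sin(pi*u)/(pi*u), the same sinc as in the theorem
    return np.sum(samples * np.sinc((t - n * T) / T))

B = 1.0                      # f(t) = sinc(t)^2 has Fourier support [-B, B]
T = 0.4                      # oversampled: T < 1/(2B) = 0.5
n = np.arange(-1000, 1001)   # truncation range (illustrative)
samples = np.sinc(n * T) ** 2

for t in (0.3, 1.7, -2.45):
    print(f"t = {t:+.2f}: exact {np.sinc(t)**2:.6f}, "
          f"reconstructed {shannon_reconstruct(samples, n, T, t):.6f}")
```

Because this particular test signal decays like $1/t^2$, the truncated sum already matches $f$ to several digits; slowly decaying signals converge far worse, which motivates the regularized formulas of the next section.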

The theorem admits generalizations to settings with multivariate domains, operator-valued functions, and signals defined on non-Euclidean spaces (0809.5153, Pesenson, 2017, Pesenson, 2013).

The framework is fundamentally tied to the time–frequency uncertainty principle, as the product of the sampling interval and the bandwidth determines the possibility of faithful reconstruction (Fujikawa et al., 2015).

2. Error, Robustness, and Regularization: The Need for Windowed and Localized Sampling

While the theoretical cardinal series ensures perfect recovery under idealized hypotheses, practical use is impeded by the slow decay of the sinc kernel, $O(1/|t|)$, which causes truncated series to converge slowly (at best $O(1/\sqrt{N})$ in the maximum norm) (Kircheis et al., 2024, Kircheis et al., 2023, Kircheis et al., 2022, Lin et al., 2016). Additionally, sinc truncation is highly sensitive to noise: in the presence of bounded errors $\epsilon_n$ at the sample points, the reconstruction error can grow like $\|S_N\|\,\epsilon$ with $\|S_N\| \sim (2/\pi)\ln N$ as $N\to\infty$ (Kircheis et al., 2023).
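This logarithmic growth of the noise-amplification factor can be observed directly: for $T = 1$, evaluating $\sum_{|n|\le N}|\mathrm{sinc}(t-n)|$ at the worst-case point $t = 1/2$ tracks $(2/\pi)\ln N$. A minimal check (the evaluation point and the values of $N$ are illustrative):

```python
import numpy as np

def lebesgue_at_half(N):
    """Noise-amplification factor of the truncated sinc series at t = 1/2
    (unit sampling interval T = 1): sum over |n| <= N of |sinc(1/2 - n)|."""
    n = np.arange(-N, N + 1)
    return np.sum(np.abs(np.sinc(0.5 - n)))

for N in (100, 1000, 10000):
    print(f"N = {N:5d}: ||S_N|| ~ {lebesgue_at_half(N):.3f}, "
          f"(2/pi) ln N = {2 / np.pi * np.log(N):.3f}")
```

Each tenfold increase in $N$ adds roughly $(2/\pi)\ln 10 \approx 1.47$ to the amplification factor, matching the asymptotic rate.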

To overcome these deficits, Shannon-type sampling is enhanced by regularization, leveraging window functions that decay rapidly or feature compact support. The general form is

$$f_N(t) = \sum_{|n| \leq N} f(nT)\,\mathrm{sinc}\left(\frac{t-nT}{T}\right) w(t-nT; \alpha),$$

where $w(\cdot;\alpha)$ may be a Gaussian, B-spline, sinh-type, or continuous Kaiser–Bessel window (Kircheis et al., 2024, Kircheis et al., 2022, Filbir et al., 2023). Proper tuning of the window's parameters and oversampling ($T$ smaller than the Nyquist interval) ensures that the uniform error decays exponentially with $N$:

  • The Gaussian window gives $E_N^{(G)} = O(N^{-1/2}e^{-N(\pi - \delta)/2})$,
  • sinh-type and continuous Kaiser–Bessel windows achieve $E_N = O(e^{-N(\pi - \delta)})$,
  • the B-spline window allows superalgebraic decay by adjusting its order.

Table: Decay rates for regularized Shannon-type expansions (Kircheis et al., 2024)

| Window              | Exponential Rate       | Compact Support |
|---------------------|------------------------|-----------------|
| Gaussian            | $e^{-N(\pi-\delta)/2}$ | No              |
| Sinh-type           | $e^{-N(\pi-\delta)}$   | Yes             |
| Kaiser–Bessel (cKB) | $e^{-N(\pi-\delta)}$   | Yes             |

This regularization also improves robustness: the amplification of sample perturbations is bounded by $O(\sqrt{N})$, in contrast to the logarithmic divergence $\|S_N\| \sim (2/\pi)\ln N$ of the unregularized truncated series (Kircheis et al., 2022).
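To illustrate the effect, the sketch below compares a plain truncated sinc sum with its Gaussian-regularized, localized counterpart on an oversampled test signal. The window width $\sigma = \sqrt{m/(\pi-\eta)}$ follows the standard Gaussian-regularization analysis; the test signal, truncation size $m$, and evaluation grid are illustrative choices, not taken from the cited papers.

```python
import numpy as np

eta = np.pi / 2                       # signal band [-eta, eta] in radians (T = 1)
f = lambda t: np.sinc((t - 0.7) / 2)  # bandlimited to [-pi/2, pi/2], oversampled

m = 20                                # samples used on each side of t (illustrative)
sigma = np.sqrt(m / (np.pi - eta))    # Gaussian width from the standard analysis

def reconstruct(t, window):
    """Localized sampling: use only the 2m+1 samples nearest to t."""
    n0 = int(round(t))
    n = np.arange(n0 - m, n0 + m + 1)
    return np.sum(f(n) * np.sinc(t - n) * window(t - n))

plain = lambda u: 1.0                              # truncated sinc, no window
gauss = lambda u: np.exp(-u**2 / (2 * sigma**2))   # Gaussian regularization

ts = np.linspace(0.05, 0.95, 19)
err_plain = max(abs(reconstruct(t, plain) - f(t)) for t in ts)
err_gauss = max(abs(reconstruct(t, gauss) - f(t)) for t in ts)
print(f"plain truncation error:     {err_plain:.2e}")
print(f"Gaussian-regularized error: {err_gauss:.2e}")
```

With the same $2m+1$ samples, the Gaussian window shrinks the uniform error by orders of magnitude, consistent with the exponential rates in the table above.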

3. Extensions: Non-uniform Sampling, Derivatives, Random and Sparse Schemes

Non-uniform and Derivative Data: Shannon himself noted that derivatives can be incorporated to enhance the $WT$ product and enable reconstruction from non-uniform samples. The general formula involves Hermite-type expansions:
$$f(t) = \sum_n \Big\{ f(t_n)\,\phi_n(t) + f'(t_n)\,\psi_n(t) \Big\},$$
where the interpolation kernels $\phi_n$, $\psi_n$ are explicit and depend on the sample locations (0905.0397). Multidimensional and higher-derivative analogues are available via polynomial-exponential representations.
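On a uniform grid the kernels specialize to the classical second-order form $\phi_n(t) = \mathrm{sinc}^2((t-nh)/h)$ and $\psi_n(t) = (t-nh)\,\mathrm{sinc}^2((t-nh)/h)$, which recovers a signal bandlimited to $[-B,B]$ from samples of $f$ and $f'$ at the doubled spacing $h \le 1/B$. A numerical sketch of this special case (the cited work treats general non-uniform grids; the test signal, spacing, and truncation here are illustrative):

```python
import numpy as np

def dsinc(u):
    """Derivative of np.sinc(u) = sin(pi*u)/(pi*u), with dsinc(0) = 0."""
    u = np.asarray(u, dtype=float)
    safe = np.where(u == 0.0, 1.0, u)            # avoid division by zero
    val = np.cos(np.pi * safe) / safe - np.sin(np.pi * safe) / (np.pi * safe**2)
    return np.where(u == 0.0, 0.0, val)

# Test signal: f(t) = sinc(0.4 t)^2, bandlimited to [-0.4, 0.4] Hz
f  = lambda t: np.sinc(0.4 * t) ** 2
df = lambda t: 2 * np.sinc(0.4 * t) * dsinc(0.4 * t) * 0.4   # chain rule

h = 2.0                       # Shannon alone would need h <= 1/(2B) = 1.25
n = np.arange(-500, 501)      # truncation range (illustrative)
tn = n * h

def hermite_reconstruct(t):
    u = (t - tn) / h
    return np.sum((f(tn) + (t - tn) * df(tn)) * np.sinc(u) ** 2)

for t in (0.7, 3.3, -5.1):
    print(f"t = {t:+.1f}: exact {f(t):.6f}, "
          f"reconstructed {hermite_reconstruct(t):.6f}")
```

Note the spacing $h = 2.0$ exceeds the plain Nyquist interval $1/(2B) = 1.25$; the derivative samples supply the missing information.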

Random and Sparse Sampling: Classical sampling requires uniform samples at a rate determined by the bandwidth. However, if the signal is sparse in the frequency domain, random subsampling suffices. Candès–Romberg–Tao showed that for an $S$-sparse signal of length $N$, random sampling at $m \gtrsim C S \log N$ locations permits exact recovery with high probability by $\ell_1$-minimization (Javanmard et al., 2012). Further advances (spatially coupled Gabor sampling with approximate message passing) allow reconstruction at the information-theoretic limit $\alpha > \rho$ (sampling rate exceeds sparsity) (Javanmard et al., 2012). Thus, in sparse regimes, “Shannon-type” recovery is attainable with highly sub-Nyquist sample counts.

Random Sampling and Nonuniform Interpolation: For random, non-uniform sampling, efficient practical recovery can be achieved by embedding the Shannon kernel in an observation matrix (using Poisson summation to accelerate convergence) and applying sparse-recovery algorithms (greedy OMP or convex TV minimization) (0909.2292).
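A toy illustration of sparse spectral recovery from sub-Nyquist random samples, using greedy orthogonal matching pursuit rather than the AMP or $\ell_1$ solvers of the cited works; the dimensions, sparsity, coefficient magnitudes, and random seed are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
N, S, m = 256, 3, 96            # signal length, sparsity, number of random samples

# S-sparse spectrum with well-separated magnitudes and random phases
support = rng.choice(N, size=S, replace=False)
c = np.zeros(N, dtype=complex)
c[support] = np.array([3.0, 4.0, 5.0]) * np.exp(2j * np.pi * rng.random(S))

F = np.fft.ifft(np.eye(N), axis=0) * np.sqrt(N)   # unit-norm inverse-DFT columns
x = F @ c                                          # time-domain signal

rows = rng.choice(N, size=m, replace=False)        # random sub-Nyquist sample set
A, y = F[rows], x[rows]

# Orthogonal matching pursuit: greedily pick the best-correlated frequency,
# then re-fit all selected coefficients by least squares.
residual, picked = y.copy(), []
for _ in range(S):
    picked.append(int(np.argmax(np.abs(A.conj().T @ residual))))
    coef, *_ = np.linalg.lstsq(A[:, picked], y, rcond=None)
    residual = y - A[:, picked] @ coef

c_hat = np.zeros(N, dtype=complex)
c_hat[picked] = coef
print("support recovered:", sorted(picked) == sorted(support.tolist()))
print("max coefficient error:", np.max(np.abs(c_hat - c)))
```

Here $m = 96 \ll N = 256$ samples suffice because only $S = 3$ frequencies are active, the "sub-Nyquist" regime described above.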

4. Extensions Beyond Classical Function Spaces: Manifolds, Operators, Distributions, and Growth Relaxations

Bandlimited Spaces on Manifolds: On compact Riemannian manifolds, the Paley–Wiener space $PW_\omega(M)$ is the span of eigenfunctions of the Laplace–Beltrami operator with eigenvalues up to $\omega$. Stable Shannon-type sampling holds provided the sampling set forms a metric lattice with spacing $p \sim \omega^{-1/2}$; the number of samples must scale as $N_\omega \sim \omega^{d/2}$, controlled by Weyl's law (Pesenson, 2017, Pesenson, 2013).

Operators and Pseudo-differential Analogues: For integral and time-varying linear operators with bandlimited Kohn–Nirenberg symbols (i.e., the support of the spreading function lies in a compact set), operator-valued Shannon-type sampling theorems guarantee reconstruction from action on a Dirac-comb, with reconstruction kernels generalizing the sinc function (Pfander, 2010).

Distributions and Generalized Signal Classes: The distributional Shannon theorem enables sampling and reconstruction for signals whose Fourier transforms are compactly supported distributions, not just functions—allowing for the interpolation of signals with Dirac $\delta$'s, their derivatives, or more general pointwise singularities (Sasane, 2012).

Signals with Growth: Dokuchaev constructs Shannon-type formulas for unbounded, non-decaying bandlimited signals of sublinear growth ($|x(t)| = O((1+|t|)^\alpha)$, $\alpha<1$) by designing interpolation coefficients $a_k(t)$ that decay as $O(1/|k-t|^2)$ (or faster with increased smoothness) (Dokuchaev, 2024).

Geometric and Multivariate Paradigms: Shannon-type sampling also generalizes to geometric contexts, replacing bandwidth with curvature (sampling density $\sim$ reciprocal curvature), and to multivariate and polyspline settings (e.g., expansions on annuli via polyharmonic differential operators) (Saucan et al., 2010, 0809.5153).

5. Unlimited Sampling, Modulo ADCs, and Quantized Measurements

Practical analog-to-digital converters (ADCs) have finite amplitude range and suffer from saturation/clipping, violating the assumptions of classical Shannon sampling. Unlimited sampling eliminates this bottleneck by first mapping the analog signal to a bounded interval by a modulo (folding) operation, followed by sampling with a self-reset ADC (Bhandari et al., 2019):
$$y[n] = M_\lambda\big(x(nT)\big), \qquad M_\lambda(u) = u - 2\lambda \left\lfloor \frac{u}{2\lambda} + \frac12 \right\rfloor.$$
Perfect reconstruction is attainable, up to an overall constant, if the sampling period $T$ satisfies $T\Omega e < 1/2$, regardless of the ADC threshold $\lambda$ or the amplitude bound $A$; the necessary rate depends only on the bandwidth. The associated unwrapping and de-folding process employs finite-difference operations, anti-differencing, and rounding to reconstruct the true samples, followed by Shannon–Whittaker interpolation.

This framework supports stability under bounded noise, including quantization, and demonstrates accurate recovery of signals with arbitrarily large amplitudes using only the limited-range samples (Bhandari et al., 2019).
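A minimal sketch of the fold-and-unwrap idea, under the simplifying assumptions that consecutive samples differ by less than $\lambda$ and that the first sample is unfolded ($|x(0)| < \lambda$); the higher-order-difference machinery of the cited paper handles the general case:

```python
import numpy as np

lam = 1.0                                   # ADC folding threshold (illustrative)

def modulo(u, lam=lam):
    """Centered modulo M_lambda: maps u into [-lam, lam)."""
    return u - 2 * lam * np.floor(u / (2 * lam) + 0.5)

# Oversampled smooth signal with amplitude far beyond the ADC range
t = np.arange(400) * 0.1
x = 5.0 * np.sin(2 * np.pi * 0.05 * t)      # amplitude 5 >> lam
y = modulo(x)                               # folded (self-reset ADC) samples

# Unwrap: since |x[n+1] - x[n]| < lam, the folded first difference of y
# equals the true first difference of x; anti-differencing recovers x.
dx = modulo(np.diff(y))
x_rec = y[0] + np.concatenate(([0.0], np.cumsum(dx)))

print("max reconstruction error:", np.max(np.abs(x_rec - x)))
```

The folded samples never leave $[-\lambda, \lambda)$, yet the full amplitude-5 signal is recovered exactly, matching the claim that the necessary rate depends only on the bandwidth.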

6. Shannon-type Sampling on Manifolds and in Geometric Settings

On compact Riemannian manifolds, “Shannon-type” sampling theorems connect the geometry (via Laplacian eigenvalues and the associated $PW_\omega(M)$ spaces) with stable reconstruction. Sampling lattices with mesh size $p \sim \omega^{-1/2}$ suffice for stable recovery, and the minimal sample cardinality is comparable to the spectral counting function (Pesenson, 2017, Pesenson, 2013).

In geometric signal processing, sampling density adapts to local curvature, paralleling the Nyquist interval. For a smooth submanifold $\Sigma \subset \mathbb{R}^N$, the maximal principal curvature $k(p)$ at $p$ dictates the maximal allowable sample spacing $\Delta(p) \le 1/k(p)$. Fat triangulations provide explicit constructions for sampling schemes and vector quantization, linking curvature to coding dimensions (Saucan et al., 2010).

7. Summary Table of Key Shannon-Type Sampling Variants

| Setting | Key Reconstruction Formula | Sampling Density/Rate Condition | Notable Features |
|---|---|---|---|
| Classical Shannon (bandlimited) | $x(t) = \sum x(nT)\,\mathrm{sinc}(\cdot)$ | $T \leq 1/(2B)$ (Nyquist) | Uniform sampling; slow decay of $\mathrm{sinc}$ |
| Regularized (windowed) sampling | $\sum x(nT)\,\mathrm{sinc}(\cdot)\,w(\cdot)$ | $T < T_{\rm Nyq}$, window parameters tuned | Exponential decay for proper windows; robust |
| Unlimited (modulo ADC) | As above, with unwrapping/defolding | $T\Omega e < 1/2$ (rate $> 2\Omega e$) | ADC threshold $\lambda$ arbitrary; robust to quantization |
| Sparse/random sampling (compressed) | Optimization (AMP/$\ell_1$), nonuniform | $m \gtrsim \rho N$ (sparsity level) | Shannon-type phase transitions; minimal rates |
| Derivative/non-uniform (Hermite) | $\sum x(t_n)\phi_n(t) + x'(t_n)\psi_n(t)$ | Location-dependent | Uses derivative data, arbitrary grids |
| Manifold/PDO/operator | Expansion in Laplacian eigenbasis/frames | Lattice mesh $p\sim\omega^{-1/2}$ | Links to Weyl's law, curvature, frames |
| Distributional/signals with growth | Coefficients with polynomial decay | Mild oversampling, growth less than linear | Supports non-decaying/unbounded signals |

References

  • (Javanmard et al., 2012) Javanmard & Montanari, "Subsampling at Information Theoretically Optimal Rates"
  • (Bhandari et al., 2019) Bhandari et al., "On Unlimited Sampling and Reconstruction"
  • (Kircheis et al., 2024) Kircheis, Potts, Tasche, "Some remarks on regularized Shannon sampling formulas"
  • (Kircheis et al., 2023) Kircheis, Potts, Tasche, "On numerical realizations of Shannon's sampling theorem"
  • (Kircheis et al., 2022) Kircheis, Potts, Tasche, "On regularized Shannon sampling formulas with localized sampling"
  • (Lin, 2017, Lin et al., 2016) Lin & Zhang, "An Optimal Convergence Rate for the Gaussian Regularized Shannon Sampling Series"
  • (0809.5153) Kounchev & Render, "On a new multivariate sampling paradigm and a polyspline Shannon function"
  • (0909.2292) Wang & Sha, "Random Sampling Using Shannon Interpolation and Poisson Summation Formulae"
  • (Fujikawa et al., 2015) Saito et al., "Uncertainty principle, Shannon-Nyquist sampling and beyond"
  • (Pesenson, 2013, Pesenson, 2017) Pesenson et al., "Shannon Sampling and Parseval Frames on Compact Manifolds"/"Shannon sampling and Weak Weyl's Law on compact Riemannian manifolds"
  • (Pfander, 2010) Pfander, "Sampling of operators"
  • (Sasane, 2012) Bhandari et al., "Shannon's sampling theorem in a distributional setting"
  • (Dokuchaev, 2024) Dokuchaev, "Sampling Theorem and explicit interpolation formula for non-decaying unbounded signals"
  • (Saucan et al., 2010) Saucan, "Geometric approach to sampling and communication"
  • (0905.0397) Beutler, "A representation of non-uniformly sampled deterministic and random signals and their reconstruction using sample values and derivatives"