Chebyshev Polynomial Expansions

Updated 3 March 2026
  • Chebyshev polynomial expansions are defined by orthogonality and recurrence relations, enabling stable numerical approximations of functions and operators.
  • They achieve near-minimax uniform approximations with exponentially decaying coefficients for analytic functions, ensuring rapid convergence and error control.
  • Applications span computational physics, PDE solvers, and machine learning via efficient, matrix-free algorithms and robust spectral methods.

Chebyshev polynomial expansions provide a robust, numerically stable framework for approximating functions, operators, and solutions to differential equations. Exploiting the orthogonality, recurrence, and near-minimax properties of Chebyshev polynomials of the first kind, $T_n(x)=\cos(n\arccos x)$ on $[-1,1]$, these expansions underpin both classical harmonic analysis and cutting-edge computational algorithms across applied mathematics, computational physics, and machine learning.

1. Algebraic and Analytic Structure

Chebyshev polynomials of the first kind, $T_n(x)$, admit several explicit representations: the trigonometric formula $T_n(x)=\cos(n\arccos x)$ for $x\in[-1,1]$, and a stable three-term recurrence $T_{n+1}(x)=2xT_n(x)-T_{n-1}(x)$ with $T_0(x)=1$, $T_1(x)=x$ (Chen et al., 2 Feb 2026, Saibaba, 2021). Orthogonality with respect to the weight $(1-x^2)^{-1/2}$ on $[-1,1]$ leads to

$$\int_{-1}^1 T_j(x)\,T_k(x)\,(1-x^2)^{-1/2}\,dx = \begin{cases} \pi, & j=k=0, \\ \frac{\pi}{2}, & j=k>0, \\ 0, & j\neq k, \end{cases}$$

which ensures that the expansion coefficients of any sufficiently smooth $f(x)$ are uniquely determined by

$$a_n=\frac{2-\delta_{n,0}}{\pi} \int_{-1}^1 f(x)\,T_n(x)\,(1-x^2)^{-1/2}\,dx.$$

The three-term recurrence and the tight bound $|T_n(x)|\leq 1$ yield strong numerical stability for both evaluation and manipulation of Chebyshev expansions, even at high degree (Chen et al., 2 Feb 2026).
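
The stability of the recurrence is easy to check directly against the trigonometric definition; the following minimal Python/NumPy sketch (the function name `cheb_T` is illustrative) evaluates $T_{50}$ both ways:

```python
import numpy as np

def cheb_T(n, x):
    """Evaluate T_n(x) via the stable three-term recurrence."""
    t_prev, t_curr = np.ones_like(x), x          # T_0, T_1
    if n == 0:
        return t_prev
    for _ in range(n - 1):
        t_prev, t_curr = t_curr, 2.0 * x * t_curr - t_prev
    return t_curr

x = np.linspace(-1.0, 1.0, 1001)
# agreement with cos(n arccos x) to near machine precision, even at degree 50
err = np.max(np.abs(cheb_T(50, x) - np.cos(50.0 * np.arccos(x))))
```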

2. Spectral, Minimax, and Conditioning Properties

Chebyshev expansions furnish a near-minimax uniform approximation among all degree-$N$ polynomials: truncating a Chebyshev series at degree $N$ yields a polynomial whose $L^\infty$ error is within a logarithmic factor of the best possible uniform approximation (Benoit et al., 2014, Chen et al., 2 Feb 2026). The Lebesgue constant for Chebyshev interpolation nodes grows only logarithmically, $\Lambda_N \leq (2/\pi)\log N + O(1)$, guaranteeing that polynomial interpolation and quadrature at these nodes remain numerically stable and protected from Runge-type divergence (Chen et al., 2 Feb 2026).
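
The practical effect of the small Lebesgue constant can be seen by interpolating the classical Runge function at equispaced versus Chebyshev nodes (a minimal Python sketch; the degree and evaluation-grid sizes are arbitrary choices):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

runge = lambda x: 1.0 / (1.0 + 25.0 * x**2)
deg = 20
xx = np.linspace(-1.0, 1.0, 2001)            # dense evaluation grid

x_eq = np.linspace(-1.0, 1.0, deg + 1)       # equispaced nodes
k = np.arange(deg + 1)
x_ch = np.cos((2 * k + 1) * np.pi / (2 * (deg + 1)))  # Chebyshev (first-kind) nodes

# interpolate in the well-conditioned Chebyshev basis at both node sets
c_eq = C.chebfit(x_eq, runge(x_eq), deg)
c_ch = C.chebfit(x_ch, runge(x_ch), deg)
err_eq = np.max(np.abs(C.chebval(xx, c_eq) - runge(xx)))
err_ch = np.max(np.abs(C.chebval(xx, c_ch) - runge(xx)))
# Runge-type divergence at equispaced nodes; uniform accuracy at Chebyshev nodes
```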

Conditioning improves dramatically compared to monomial bases: the Gram matrix $G_{jk}=\int_{-1}^1 T_j(x)T_k(x)\,dx$ is diagonally dominant, so $\kappa(G)$ grows only polynomially, whereas for monomials it can grow exponentially in $N$ (Chen et al., 2 Feb 2026). This underlies superior backward stability in spectral algorithms and stable gradient flow in physics-informed neural architectures.
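
This conditioning gap is straightforward to observe numerically. The sketch below (assuming NumPy; Gauss–Legendre quadrature is used to form the Chebyshev Gram matrix exactly) compares $\kappa(G)$ for the two bases at modest degree:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

N = 15
# Gauss-Legendre rule with 2N+2 points: exact for polynomials of degree <= 4N+3
xg, wg = np.polynomial.legendre.leggauss(2 * N + 2)

# Chebyshev-basis Gram matrix G[j,k] = int_{-1}^{1} T_j T_k dx
I = np.eye(N + 1)
T = np.array([C.chebval(xg, I[j]) for j in range(N + 1)])  # T_j at quad nodes
G_cheb = (T * wg) @ T.T

# monomial Gram matrix: int_{-1}^{1} x^{j+k} dx = 2/(j+k+1) for j+k even, else 0
jk = np.add.outer(np.arange(N + 1), np.arange(N + 1))
G_mono = np.where(jk % 2 == 0, 2.0 / (jk + 1.0), 0.0)
# cond(G_mono) is astronomically larger than cond(G_cheb)
```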

3. Expansion, Truncation, and Fast Transforms

For analytic $f(x)$, the Chebyshev coefficients $a_n$ decay exponentially fast: $|a_n| \leq 2M\rho^{-n}$ if $f$ can be analytically continued to a Bernstein ellipse with parameter $\rho>1$ (Chen et al., 2 Feb 2026, Gull et al., 2018, Benoit et al., 2014). Consequently, the $N$-term truncation error is exponentially small:

$$\Bigl\|f - \sum_{k=0}^N a_k T_k\Bigr\|_{L^\infty} \leq \frac{C\rho^{-N}}{\rho-1}.$$

In practical computation, the coefficients can be computed efficiently via the discrete cosine transform (DCT) or Clenshaw–Curtis quadrature by evaluating $f$ at the Chebyshev nodes $x_j = \cos(j\pi/N)$, $j=0,\dots,N$, at a computational cost of $O(N\log N)$ (Chen et al., 2 Feb 2026, Gull et al., 2018).
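
As an illustration, the node-based coefficient computation can be sketched in a few lines of Python (a dense DCT-style matrix is used for clarity instead of an $O(N\log N)$ FFT; the function name `cheb_coeffs` is illustrative):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def cheb_coeffs(f, N):
    """Coefficients of the degree-N Chebyshev interpolant of f on [-1, 1],
    via a DCT-I-style sum over the extreme points x_j = cos(pi j / N)."""
    j = np.arange(N + 1)
    x = np.cos(np.pi * j / N)
    w = np.ones(N + 1)
    w[0] = w[-1] = 0.5                      # halve the endpoint terms
    k = np.arange(N + 1)
    M = np.cos(np.pi * np.outer(k, j) / N)  # dense cosine-transform matrix
    c = (2.0 / N) * (M @ (w * f(x)))
    c[0] *= 0.5
    c[-1] *= 0.5
    return c

c = cheb_coeffs(np.exp, 20)   # coefficients of e^x decay exponentially
```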

Efficient evaluation at arbitrary $x$ is achieved using Clenshaw's recurrence, and validated interval enclosures for Chebyshev expansions are available via the Laurent–Horner algorithm, which is asymptotically optimal and avoids endpoint instability (Aurentz et al., 2024).
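
Clenshaw's backward recurrence for a truncated series $\sum_k c_k T_k(x)$ can be sketched as follows (a plain floating-point version; the validated enclosures of Aurentz et al. are not reproduced here):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def clenshaw(c, x):
    """Evaluate p(x) = sum_k c[k] T_k(x) by Clenshaw's backward recurrence."""
    b1 = np.zeros_like(x, dtype=float)
    b2 = np.zeros_like(x, dtype=float)
    for ck in c[:0:-1]:                        # c[n], ..., c[1]
        b1, b2 = ck + 2.0 * x * b1 - b2, b1
    return c[0] + x * b1 - b2

# cross-check against NumPy's reference evaluator on random coefficients
rng = np.random.default_rng(1)
coeffs = rng.standard_normal(30)
x = np.linspace(-1.0, 1.0, 201)
max_dev = np.max(np.abs(clenshaw(coeffs, x) - C.chebval(x, coeffs)))
```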

4. Probabilistic Error Bounds and Monomial Approximations

Chebyshev expansions of the monomials $x^n$ admit explicit binomial-coefficient formulas for the expansion coefficients. Truncating the expansion at degree $m$ leads to a supremum-norm error $E_{n,m}$, which has a sharp probabilistic interpretation:

$$E_{n,m} = \mathrm{P}\bigl(\mathrm{Binomial}(n, 1/2) \geq (n+m)/2 \bigr)$$

and satisfies the nonasymptotic exponential tail bound $E_{n,m} \leq 2 \exp(-m^2/(2n))$ by Hoeffding's inequality (Saibaba, 2021). This formalizes both the rapid decay of Chebyshev expansion tails and the near-optimality for high-degree polynomial approximation.
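
The identity and the Hoeffding bound can be checked with exact binomial tail sums (a minimal Python sketch; the values of `n` and `m` and the name `monomial_tail` are illustrative choices):

```python
import math

def monomial_tail(n, m):
    """Exact E_{n,m} = P(Binomial(n, 1/2) >= (n+m)/2) and its Hoeffding bound."""
    thresh = math.ceil((n + m) / 2)
    exact = sum(math.comb(n, k) for k in range(thresh, n + 1)) / 2.0**n
    bound = 2.0 * math.exp(-m * m / (2.0 * n))
    return exact, bound

# a degree-70 truncation of x^100 already has a small uniform error
exact, bound = monomial_tail(100, 30)
```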

5. Function and Operator Expansions: Matrix Functions and Differential Equations

Chebyshev expansions are exploited for operator functions, especially matrix-valued functions such as $f(A)$ for Hermitian $A$ with known spectral bounds. Affine spectral rescaling and Chebyshev recurrences enable fast, matrix-free algorithms for evaluating functions like the matrix logarithm or exponential, central to stochastic trace estimators (e.g., log-determinant computation) (Han et al., 2015, Castro et al., 2022). In these contexts,

$$f(A) \approx \sum_{k=0}^m c_k T_k(\tilde A),$$

where $\tilde A$ is $A$ rescaled so that its spectrum lies in $[-1,1]$, and the coefficients $c_k$ are derived from the scalar Chebyshev expansion of $f$.
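
A minimal matrix-free sketch of this scheme, assuming NumPy and using `numpy.polynomial.chebyshev.chebinterpolate` for the scalar coefficients (function and variable names are illustrative, and the spectral bounds are assumed known):

```python
import numpy as np

def cheb_matrix_apply(f, matvec, v, lam_min, lam_max, m=40):
    """Approximate f(A) @ v via a degree-m Chebyshev expansion,
    touching A only through matrix-vector products."""
    a, b = lam_min, lam_max
    def At(w):  # apply the operator affinely rescaled to spectrum in [-1, 1]
        return (2.0 * matvec(w) - (b + a) * w) / (b - a)
    # scalar Chebyshev coefficients of f, mapped back to [a, b]
    c = np.polynomial.chebyshev.chebinterpolate(
        lambda t: f(0.5 * (b - a) * t + 0.5 * (b + a)), m)
    w0, w1 = v, At(v)                      # T_0(~A) v, T_1(~A) v
    y = c[0] * w0 + c[1] * w1
    for ck in c[2:]:
        w0, w1 = w1, 2.0 * At(w1) - w0     # three-term recurrence on vectors
        y = y + ck * w1
    return y

# small Hermitian test matrix
rng = np.random.default_rng(0)
M = rng.standard_normal((8, 8))
A = 0.5 * (M + M.T)
v = rng.standard_normal(8)
lam = np.linalg.eigvalsh(A)
y = cheb_matrix_apply(np.exp, lambda w: A @ w, v, lam[0] - 0.1, lam[-1] + 0.1)
```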

In PDE and D-finite function solution frameworks, expansions $y(x)=\sum_n c_n T_n(x)$ can be combined with operator recurrences to produce efficient linear-algebraic solution schemes. For linear ODEs with polynomial coefficients, substitution induces recurrences on the Chebyshev coefficients, solved by backward recursion (Clenshaw/Miller) plus functional enclosures for rigorous error bounds, achieving both certified uniform approximation and complexity essentially linear in the degree (0906.2888, Benoit et al., 2014).
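
To illustrate the linear-algebraic flavor, here is a simple Chebyshev collocation solve of $y'=y$, $y(0)=1$ in Python/NumPy (this is a plain collocation sketch, not the certified recurrence-based algorithm of the cited works):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

N = 16
nodes = np.cos(np.pi * (np.arange(N) + 0.5) / N)   # N first-kind collocation points

# build the (N+1) x (N+1) system acting on Chebyshev coefficients
A = np.zeros((N + 1, N + 1))
for k in range(N + 1):
    e = np.zeros(N + 1)
    e[k] = 1.0                                     # coefficients of T_k
    A[:N, k] = C.chebval(nodes, C.chebder(e)) - C.chebval(nodes, e)  # y' - y
    A[N, k] = C.chebval(0.0, e)                    # initial condition row y(0)
rhs = np.zeros(N + 1)
rhs[N] = 1.0
coef = np.linalg.solve(A, rhs)                     # Chebyshev coefficients of y

xx = np.linspace(-1.0, 1.0, 401)
err = np.max(np.abs(C.chebval(xx, coef) - np.exp(xx)))  # spectral accuracy
```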

6. Applications in Machine Learning and Applied Physics

Recent advances in spectral deep learning leverage Chebyshev expansions inside neural operators for PDE surrogates and in polynomial-parametric CNN architectures. The Physics-Informed Chebyshev Polynomial Neural Operator (CPNO) encodes inputs in the Chebyshev spectral basis, replacing monomial expansions, thereby stabilizing optimization, decoupling approximation from MLP-specific constraints, and enhancing robustness and multi-scale expressivity. CPNO achieves faster convergence and superior accuracy in parameterized PDE problems, with empirical spectral convergence saturating after moderate Chebyshev order $P \approx 5$ (Chen et al., 2 Feb 2026).

Hybrid CNNs using Chebyshev expansions in convolutional layers attain state-of-the-art classification accuracy on medical imaging datasets, benefiting from high-frequency sensitivity, basis orthogonality, efficient recurrence construction, and minimax approximation qualities (Roy et al., 9 Apr 2025).

In quantum many-body and condensed matter physics, Chebyshev polynomial representations of Green's functions and self-energies in imaginary time enable dense linear-algebraic reformulations of key operations (e.g., convolutions, Dyson equations, Fourier/Matsubara transforms) with exponential convergence and superior error control compared to fixed or adaptive grids (Gull et al., 2018). Large-scale quantum transport and tight-binding Green's function calculations further exploit Chebyshev expansions for efficient and accurate conductance evaluation, with spectral smoothing kernels (Jackson, Lorentz) controlling truncation artifacts (Castro et al., 2022).
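
For reference, the Jackson damping factors used in such kernel polynomial calculations have a simple closed form (a minimal sketch following the standard KPM convention; the function name `jackson_kernel` is illustrative):

```python
import numpy as np

def jackson_kernel(N):
    """Jackson damping factors g_0, ..., g_N for a degree-N KPM expansion;
    multiplying the Chebyshev moments by g_n suppresses Gibbs oscillations."""
    n = np.arange(N + 1)
    q = np.pi / (N + 1)
    return ((N - n + 1) * np.cos(q * n) + np.sin(q * n) / np.tan(q)) / (N + 1)

g = jackson_kernel(50)   # starts at 1 and decays smoothly to 0
```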

7. Basis Transformation, Special Expansions, and Algebraic Generalizations

Transforming between the monomial basis and the Chebyshev basis employs explicit change-of-basis sequences with three-term recurrences, enabling stable high-degree expansions with closed-form rational formulas or recurrences for special classes of polynomials (e.g., Zernike, ultraspherical, sieved random walk polynomials) (Mathar, 23 Sep 2025, Kahler, 2023). In combinatorial and algebraic contexts, inverted Chebyshev expansions relate the gamma-vector of palindromic polynomials to Chebyshev-coefficient transforms, providing real-rootedness criteria, connections to triangulation combinatorics, ce-indices, and Hopf/quasisymmetric algebraic structures (Park, 2024).

Exponential divided differences, essential in numerical linear algebra and quantum Monte Carlo, are efficiently evaluated by Chebyshev–Bessel expansions with $O(qN)$ complexity and incremental $O(N)$ update schemes, robustly controlled by Bessel tail decay analysis (Hen, 28 Dec 2025).

8. Error Quantification, Rigorous Bounds, and Theoretical Guarantees

Truncation errors for Chebyshev expansions of analytic functions admit explicit uniform bounds and can be certified a posteriori by operator-theoretic or enclosure methods (Benoit et al., 2014, Chen et al., 2 Feb 2026). Chebyshev truncations of $e^x$ and other entire functions support Taylor-like one-sided inequalities and two-sided bounds on $x<0$, with auxiliary polynomial criteria linked to the Chebyshev polynomials of the second kind $U_n(x)$ and the identity theorem for holomorphic functions in $\mathbb{C}$ (Wodecki, 2024).

These error characterizations, together with nearly optimal minimax approximation factors and the controlled growth of associated Lebesgue constants, render Chebyshev expansions a foundation for rigorous, high-accuracy numerical analysis and computational mathematics.


References:

  • Chen et al., 2 Feb 2026
  • Saibaba, 2021
  • Han et al., 2015
  • 0906.2888
  • Aurentz et al., 2024
  • Mathar, 23 Sep 2025
  • Roy et al., 9 Apr 2025
  • Gull et al., 2018
  • Benoit et al., 2014
  • Kahler, 2023
  • Castro et al., 2022
  • Hen, 28 Dec 2025
  • Wodecki, 2024
  • Park, 2024
  • Heath et al., 2019
