Weak Tchebychev Greedy Algorithm

Updated 20 January 2026
  • WCGA is a nonlinear greedy algorithm for sparse approximation in Banach spaces featuring adaptive atom selection and optimal Chebyshev projection.
  • It achieves Lebesgue-type error bounds under mild geometric and incoherence conditions, ensuring near-optimal recovery rates.
  • The algorithm is robust to numerical errors and versatile in applications ranging from function recovery to convex optimization.

The Weak Tchebychev Greedy Algorithm (WCGA) is a foundational nonlinear greedy algorithm for sparse approximation and sampling recovery in Banach spaces, generalizing classical greedy methods such as Orthogonal Matching Pursuit to settings lacking Hilbert space structure. The WCGA combines adaptive atom selection with optimal projection onto incrementally constructed subspaces, achieving Lebesgue-type optimality bounds under mild geometric conditions on the ambient space and dictionary. It is a central tool in approximation theory and information-based complexity, with applications ranging from function recovery from sparse samples to rational approximation in operator theory and convex optimization.

1. Formal Definition and Implementation

Let $X$ be a (real or complex) Banach space equipped with norm $\|\cdot\|_X$, and let $\mathcal D = \{g_i\}_{i=1}^N \subset X$ be a bounded system of elements with $\|g_i\|_X \leq 1$. Fix a weakness parameter $t \in (0,1]$. For a given target $f_0 \in X$, the WCGA iteratively constructs, for $m = 1, 2, \dots$:

  • Forms the residual $f_{m-1}$ and its norming (peak) functional $F_{f_{m-1}} \in X^*$, which satisfies $\|F_{f_{m-1}}\|_{X^*} = 1$ and $F_{f_{m-1}}(f_{m-1}) = \|f_{m-1}\|_X$.
  • Selects $\theta_m \in \mathcal D$ such that

$$|F_{f_{m-1}}(\theta_m)| \geq t \sup_{g \in \mathcal D} |F_{f_{m-1}}(g)|.$$

  • Defines $V_m = \operatorname{span}\{\theta_1, \dots, \theta_m\}$.
  • Computes the Chebyshev projection $G_m$ of $f_0$ onto $V_m$, i.e.,

$$G_m = \arg\min_{G \in V_m} \|f_0 - G\|_X,$$

and updates $f_m = f_0 - G_m$.

This process repeats for a predetermined number of iterations or until the residual norm falls below a desired threshold. Crucially, the algorithm optimizes over the entire span $V_m$ at each step, unlike thresholding-type greedy schemes, which update only by the newly selected atom.
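In a general Banach space, both the norming functional and the Chebyshev projection are space-specific computations. The following minimal sketch specializes the steps above to the Hilbert space $\mathbb{R}^d$, where the norming functional of $f$ is $\langle \cdot, f/\|f\|\rangle$ and the Chebyshev projection is the ordinary least-squares projection, so the WCGA coincides with weak Orthogonal Matching Pursuit; the helper name `wcga_l2` is illustrative, not from the literature.

```python
import numpy as np

def wcga_l2(f0, D, t=0.5, max_iter=20, tol=1e-12):
    """WCGA sketch in R^d (Hilbert case = weak Orthogonal Matching Pursuit).
    D is a (d, N) array whose columns are unit-norm dictionary atoms."""
    f0 = np.asarray(f0, dtype=float)
    approx = np.zeros_like(f0)
    residual = f0.copy()
    selected = []
    for _ in range(max_iter):
        r = np.linalg.norm(residual)
        if r < tol:
            break
        # In l2 the norming functional is <., residual / ||residual||>,
        # so |F_{f_{m-1}}(g_i)| is the normalized correlation with atom g_i.
        corr = np.abs(D.T @ residual) / r
        if corr.max() < tol:
            break
        # Weak selection: any atom with |F(g)| >= t * sup_g |F(g)| qualifies;
        # we simply take the first such atom.
        idx = int(np.flatnonzero(corr >= t * corr.max())[0])
        if idx not in selected:
            selected.append(idx)
        # Chebyshev (best) projection of f0 onto span of the selected atoms;
        # in l2 this is the orthogonal projection, computed by least squares.
        A = D[:, selected]
        coef, *_ = np.linalg.lstsq(A, f0, rcond=None)
        approx = A @ coef
        residual = f0 - approx
    return approx, selected
```

For instance, with an orthonormal dictionary and $t = 1$ the sketch recovers a sparse target exactly in as many steps as it has nonzero coefficients.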

2. Geometric and Dictionary Structural Assumptions

The theoretical guarantees of the WCGA depend on several geometric properties of $X$ and structural properties of $\mathcal D$:

  • Uniform Smoothness: $X$ is assumed to be uniformly smooth, i.e., its modulus of smoothness satisfies $\rho_X(\tau) \leq \gamma \tau^q$ for some $1 < q \leq 2$ and $\gamma > 0$.
  • Incoherence Property: $\mathcal D$ satisfies a $(v,S)$-incoherence property with parameters $V > 0$, $r > 0$ if for all $A \subset B \subset \{1,\dots,N\}$ with $|A| \leq v$, $|B| \leq S$, and any scalars $\{c_i\}_{i \in B}$,

$$\sum_{i \in A} |c_i| \leq V |A|^r \left\| \sum_{i \in B} c_i g_i \right\|_X.$$

  • (Optional) Unconditionality: For sharper bounds, some results require a $(v,S)$-unconditionality property: $\left\| \sum_{i \in A} c_i g_i \right\|_X \leq U \left\| \sum_{i \in B} c_i g_i \right\|_X$.

These parameters determine the approximation and recovery rates of the WCGA and govern the complexity of sparse approximation.
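As a concrete instance of the incoherence property: for an orthonormal dictionary in $\ell_2$, the Cauchy–Schwarz inequality gives $(v,S)$-incoherence with $V = 1$ and $r = 1/2$, since $\sum_{i \in A} |c_i| \leq |A|^{1/2} \|c_A\|_2 \leq |A|^{1/2} \|\sum_{i \in B} c_i g_i\|_2$. The snippet below is a quick numerical check of this special case (dimensions and coefficients are arbitrary illustrations).

```python
import numpy as np

# Orthonormal dictionary in l2: atoms g_i = e_i. Cauchy-Schwarz gives the
# (v, S)-incoherence property with V = 1, r = 1/2:
#   sum_{i in A} |c_i| <= |A|^{1/2} * || sum_{i in B} c_i g_i ||_2.
rng = np.random.default_rng(0)
d, S = 8, 6
D = np.eye(d)                          # orthonormal atoms as columns
c = rng.standard_normal(S)             # arbitrary coefficients on B
B = np.arange(S)
for v in range(1, S + 1):
    A = B[:v]                          # a subset A of B with |A| = v
    lhs = np.abs(c[A]).sum()
    rhs = np.sqrt(v) * np.linalg.norm(D[:, B] @ c)   # V = 1, r = 1/2
    assert lhs <= rhs + 1e-12          # incoherence inequality holds
```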

3. Lebesgue-Type Inequalities and Optimality Rates

The core theoretical result is a Lebesgue-type inequality relating the WCGA's $m$-term error to the best $v$-term error, defined as $\sigma_v(f_0, \mathcal D)_X = \inf_{g \in \Sigma_v(\mathcal D)} \|f_0 - g\|_X$:

  • Theorem (Lebesgue-Type for WCGA): If $X$ is uniformly smooth with $\rho_X(\tau) \leq \gamma \tau^q$ and $\mathcal D$ is $(v,S)$-incoherent with parameters $(V, r)$, then for $q' = q/(q-1)$, after $m = v + v'$ steps with

$$v' = \left\lfloor C V^{q'} (\ln 2Vv)^{r q'} \right\rfloor, \quad v + v' \leq S,$$

the residual satisfies

$$\|f_0 - G_{v+v'}\|_X \leq C'(t, \gamma, q) \, \sigma_v(f_0, \mathcal D)_X,$$

where the constants are explicit in the problem parameters (Temlyakov, 2023; Dai et al., 13 Jan 2026). In $L_p$ spaces, $q = \min\{p, 2\}$, so these bounds are fully explicit and show near-optimality (modulo logarithmic factors) compared to the best $v$-term nonlinear approximation.

Significantly, for systems such as Riesz or orthonormal bases, $V$, $r$, and the required $m$ can be made explicit, yielding precise recovery complexity and optimality rates for various function classes.
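To make the logarithmic-order oversampling in the theorem concrete, the snippet below evaluates the extra-iteration count $v'$ as a function of the sparsity $v$. All numeric choices ($V$, $r$, $p$, and especially the absolute constant $C$, which the theorem leaves unspecified) are illustrative assumptions, so only the growth behavior, not the values, is meaningful.

```python
import math

def extra_steps(v, V=2.0, r=0.5, p=1.5, C=1.0):
    """Evaluate v' = floor(C * V**q' * (ln 2Vv)**(r*q')) from the
    Lebesgue-type theorem, with q = min(p, 2) and q' = q/(q-1).
    C is an unspecified absolute constant, set to 1 for illustration."""
    q = min(p, 2.0)
    qp = q / (q - 1.0)
    return math.floor(C * V ** qp * math.log(2 * V * v) ** (r * qp))

# The number of extra steps grows only polylogarithmically in v:
for v in (10, 100, 1000):
    print(v, extra_steps(v))
```

Even as $v$ grows by two orders of magnitude, $v'$ grows only by a modest factor, which is the content of the "near-optimality modulo logarithmic factors" claim.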

4. WCGA in Sampling Recovery and Universal Discretization

In sampling recovery, the aim is to reconstruct functions from finite samples. The synthesis of universal sampling discretization with the WCGA yields explicit and near-optimal recovery in $L_p$:

  • Universal $L_p$ Sampling Discretization: A finite set $S = \{x_j\}_{j=1}^m$ provides $L_p$-universal discretization for a family of subspaces $\{X(n)\}$ if there exist constants $C_1, C_2 > 0$ such that for all $f \in X(n)$,

$$C_1 \|f\|_{L_p}^p \leq \frac{1}{m} \sum_{j=1}^m |f(x_j)|^p \leq C_2 \|f\|_{L_p}^p.$$

  • Discretized WCGA: Restricting $\mathcal D$ to the sampled points and applying the WCGA in $\ell_p$ over the sampled vectors yields discrete Lebesgue-type bounds that transfer to the continuous $L_p$ norm via the discretization inequalities.

For nonlinear classes defined via coefficient decay in certain systems (e.g., $A^{r,\tau}(\mathcal Y)$), this combination yields recovery rates

$$\|f - G_u\|_{L_p} \lesssim v^{-(1/2 + r/d)}$$

with $u$ polynomial or near-linear in $v$ (modulo logarithmic factors), and sample complexity $m$ explicit in $v$ and $p$ (Temlyakov, 2023).
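A simple case where the discretization inequality holds with the best possible constants: for real trigonometric polynomials of degree $n$ and $p = 2$, sampling at $m \geq 2n+1$ equispaced points reproduces the Parseval identity exactly, so $C_1 = C_2 = 1$. The check below verifies this numerically for one randomly chosen polynomial (the degree and sample count are arbitrary illustrations).

```python
import numpy as np

# For a degree-n real trigonometric polynomial and m >= 2n+1 equispaced
# points, (1/m) * sum_j |f(x_j)|^2 equals ||f||_{L2}^2 (normalized measure)
# exactly, i.e. L2-universal discretization with C1 = C2 = 1.
rng = np.random.default_rng(1)
n, m = 3, 16
a = rng.standard_normal(n + 1)        # cosine coefficients a_0 .. a_n
b = rng.standard_normal(n)            # sine coefficients   b_1 .. b_n
x = 2 * np.pi * np.arange(m) / m      # equispaced sample points on [0, 2*pi)

k = np.arange(1, n + 1)
fx = a[0] + np.cos(np.outer(x, k)) @ a[1:] + np.sin(np.outer(x, k)) @ b

discrete = np.mean(fx ** 2)           # (1/m) * sum_j |f(x_j)|^2
# Parseval for the measure dt/(2*pi): a_0^2 + (1/2) * sum (a_k^2 + b_k^2)
continuous = a[0] ** 2 + 0.5 * (a[1:] ** 2).sum() + 0.5 * (b ** 2).sum()
assert np.isclose(discrete, continuous)
```

For general $p \neq 2$ and for universality across families of subspaces, equality degrades to the two-sided bounds with constants $C_1, C_2$, and the construction of good point sets is the substance of the discretization results cited above.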

5. Robustness to Numerical Error and Approximate Variants

The WCGA is stable under computational inexactness both in the selection and projection steps:

  • Generalized AWCGA: Allowing absolute and relative error parameters (perturbations, selection errors, projection inaccuracies), convergence is preserved provided the errors decay appropriately: essentially, the errors must be $o(t_n^p)$ along a subsequence, where $t_n$ denotes the weakness parameter and $p = q/(q-1)$ (Dereventsov, 2016; Dereventsov, 2015; Dereventsov et al., 2018).
  • Sharpness: If errors accumulate too rapidly (fail to be in $\ell_1$ relative to the $t_n^p$ scale), divergence or arbitrarily slow convergence can occur.
  • Practical Implication: Modest computational error does not affect the asymptotic rates: as long as inexactness is suitably controlled, the main Lebesgue-type inequalities still hold (Dereventsov et al., 2018).

6. Applications and Connections

The WCGA is central in a diverse range of applied and theoretical domains:

  • Sparse Sampling and Nonlinear Recovery: Recovery of high-dimensional functions from finite or scattered data, with rates closely tracking Kolmogorov widths and nonlinear approximation errors for classes including trigonometric, Haar, wavelet, Riesz, and Wiener systems (Temlyakov, 2023, Dai et al., 13 Jan 2026).
  • Convex Optimization: The WCGA(co) variant constructs sparse minimizers for convex objectives with projection steps onto best $m$-term approximants, outperforming purely regularized approaches in certain regimes (Dereventsov et al., 2020).
  • Rational Approximation and Operator Preconditioning: Adaptations of the WCGA to uniform norms yield robust rational approximants suitable for matrix functions such as fractional Laplacians, maintaining monotone error decay and compatibility with fast iterative linear solvers (Adler et al., 2024).

Its adaptability across Banach/Hilbert structures, strong nonlinear error guarantees, and explicit connection to geometric and dictionary properties underline its pivotal role in modern approximation and recovery theory.

7. Summary Table: Key Properties

| Aspect | WCGA Characterization | Reference |
| --- | --- | --- |
| Ambient space | Uniformly smooth Banach space | Temlyakov, 2023; Dai et al., 13 Jan 2026 |
| Dictionary property | Bounded, possibly redundant; incoherence needed | Temlyakov, 2023; Temlyakov, 2013 |
| Atom selection criterion | Norming functional, weakness $t$ | Temlyakov, 2023; Temlyakov, 2013 |
| Projection step | Chebyshev (best) onto current span | Temlyakov, 2023; Dai et al., 13 Jan 2026 |
| Optimality (Lebesgue-type) bound | Best $v$-term error up to logarithmic/constant factor | Temlyakov, 2023; Dilworth et al., 2019 |
| Sample complexity in $L_p$ recovery | Explicit via incoherence and discretization parameters | Temlyakov, 2023 |
| Robust to numerical error | Yes, under mild error decay | Dereventsov, 2016; Dereventsov, 2015 |
| Admits convex/complex/approximate variants | Yes | Dereventsov et al., 2020; Gasnikov et al., 2024 |

All results are fully constructive, explicit, and transferable to a broad spectrum of Banach-space and sampling frameworks, establishing the WCGA as the archetype for nonlinear, greedy, and sampling-based approximation algorithms.
