
Optimal pointwise sampling for $L^2$ approximation (2105.05545v3)

Published 12 May 2021 in math.NA and cs.NA

Abstract: Given a function $u\in L^2=L^2(D,\mu)$, where $D\subset \mathbb R^d$ and $\mu$ is a measure on $D$, and a linear subspace $V_n\subset L^2$ of dimension $n$, we show that near-best approximation of $u$ in $V_n$ can be computed from a near-optimal budget of $Cn$ pointwise evaluations of $u$, with $C>1$ a universal constant. The sampling points are drawn according to some random distribution, the approximation is computed by a weighted least-squares method, and the error is assessed in expected $L^2$ norm. This result improves on the results in [6,8], which require a sampling budget that is sub-optimal by a logarithmic factor, thanks to a sparsification strategy introduced in [17,18]. As a consequence, we obtain for any compact class $\mathcal K\subset L^2$ that the sampling number $\rho_{Cn}^{\rm rand}(\mathcal K)_{L^2}$ in the randomized setting is dominated by the Kolmogorov $n$-width $d_n(\mathcal K)_{L^2}$. While our result shows the existence of a randomized sampling with such near-optimal properties, we discuss remaining issues concerning its generation by a computationally efficient algorithm.
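To illustrate the kind of method the abstract refers to, the sketch below implements the standard weighted least-squares estimator with samples drawn from the density proportional to the inverse Christoffel function (the approach of [6,8]), not the paper's sparsified construction that removes the logarithmic oversampling factor. The choice of $D=[-1,1]$ with uniform measure, the Legendre basis, the target function `u`, and the helper names (`orthonormal_legendre`, `sample_optimal_density`, `weighted_least_squares`) are illustrative assumptions, not taken from the paper.

```python
# Sketch: weighted least-squares approximation of u in V_n = span of the
# first n Legendre polynomials on D = [-1, 1] with mu = dx/2, from m = C*n
# pointwise samples drawn from d sigma = (1/n) * sum_j phi_j(x)^2 d mu.
# This is the un-sparsified estimator of [6,8]; names below are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def orthonormal_legendre(x, n):
    """First n Legendre polynomials, orthonormal in L^2([-1,1], dx/2)."""
    V = np.polynomial.legendre.legvander(x, n - 1)   # columns P_0 .. P_{n-1}
    return V * np.sqrt(2 * np.arange(n) + 1)         # normalize each column

def sample_optimal_density(m, n, grid_size=20001):
    """Draw m points from d sigma via numerical inversion of the CDF."""
    grid = np.linspace(-1.0, 1.0, grid_size)
    dens = (orthonormal_legendre(grid, n) ** 2).sum(axis=1) / n * 0.5
    cdf = np.cumsum(dens)
    cdf /= cdf[-1]
    return np.interp(rng.random(m), cdf, grid)

def weighted_least_squares(u, n, m):
    """Fit u on V_n from m weighted samples; returns basis coefficients."""
    x = sample_optimal_density(m, n)
    Phi = orthonormal_legendre(x, n)                 # (m, n) design matrix
    w = n / (Phi ** 2).sum(axis=1)                   # w(x) = n / sum_j phi_j(x)^2
    sw = np.sqrt(w)
    coeffs, *_ = np.linalg.lstsq(sw[:, None] * Phi, sw * u(x), rcond=None)
    return coeffs

# Example usage: n = 8 basis functions, sampling budget m = C*n with C = 10.
u = lambda x: np.exp(np.sin(3 * x))
n, C = 8, 10
c = weighted_least_squares(u, n, C * n)
xt = np.linspace(-1, 1, 1000)
err = np.sqrt(np.mean((u(xt) - orthonormal_legendre(xt, n) @ c) ** 2))
print(f"approximate L2 error with {C * n} samples: {err:.2e}")
```

With this plain estimator the sampling budget must grow like $n\log n$ to guarantee stability in expectation; the paper's contribution is a sparsification of the random sample that brings the budget down to $Cn$ while keeping near-best accuracy in expected $L^2$ norm.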

Citations (32)
