Convergence bounds for empirical nonlinear least-squares (2001.00639v5)

Published 2 Jan 2020 in math.NA, cs.NA, and math.PR

Abstract: We consider best approximation problems in a nonlinear subset $\mathcal{M}$ of a Banach space of functions $(\mathcal{V},\|\bullet\|)$. The norm is assumed to be a generalization of the $L^2$-norm for which only a weighted Monte Carlo estimate $\|\bullet\|_n$ can be computed. The objective is to obtain an approximation $v\in\mathcal{M}$ of an unknown function $u \in \mathcal{V}$ by minimizing the empirical norm $\|u-v\|_n$. We consider this problem for general nonlinear subsets and establish error bounds for the empirical best approximation error. Our results are based on a restricted isometry property (RIP) which holds in probability and is independent of the nonlinear least squares setting. Several model classes are examined where analytical statements can be made about the RIP, and the results are compared to existing sample complexity bounds from the literature. We find that for well-studied model classes our general bound is weaker but exhibits many of the same properties as these specialized bounds. Notably, we demonstrate the advantage of an optimal sampling density (as known for linear spaces) for sets of functions with sparse representations.
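
The abstract's central object is the empirical norm $\|\bullet\|_n$, a weighted Monte Carlo estimate of $\|\bullet\|$: with samples $x_1,\dots,x_n$ drawn from a density $\rho$, one typically takes $\|v\|_n^2 = \frac{1}{n}\sum_{i=1}^n |v(x_i)|^2/\rho(x_i)$, and a RIP asks that $(1-\delta)\|v\|^2 \le \|v\|_n^2 \le (1+\delta)\|v\|^2$ hold on the relevant set of functions. The following is a minimal sketch of empirical nonlinear least squares in this spirit, not the paper's implementation; the target function, the Gaussian-mixture model class, and the uniform sampling density are illustrative assumptions.

```python
# Minimal sketch: minimize the empirical norm ||u - v||_n over a
# nonlinear model class M, with weighted Monte Carlo samples.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

def u(x):
    # Unknown target function (illustrative assumption).
    return np.sin(2 * np.pi * x) * np.exp(-x)

def model(theta, x):
    # Nonlinear model class M: a 3-term Gaussian mixture (assumption).
    a, b, c = theta.reshape(3, -1)  # amplitudes, centers, widths
    return np.sum(
        a[:, None] * np.exp(-((x[None, :] - b[:, None]) / c[:, None]) ** 2),
        axis=0,
    )

# Draw n samples from the sampling density rho on [0, 1] (uniform here,
# so rho(x_i) = 1; an optimal density would be adapted to the model class).
n = 200
x = rng.uniform(0.0, 1.0, n)
rho = np.ones(n)

def residuals(theta):
    # Weighted residuals, chosen so that
    # ||residuals||_2^2 = (1/n) * sum_i (u(x_i) - v(x_i))^2 / rho(x_i)
    #                   = ||u - v||_n^2.
    return (u(x) - model(theta, x)) / np.sqrt(n * rho)

theta0 = np.concatenate([np.ones(3), np.linspace(0.2, 0.8, 3), np.full(3, 0.2)])
fit = least_squares(residuals, theta0)
print("empirical best-approximation error ||u - v||_n:", np.linalg.norm(fit.fun))
```

With uniform sampling the weights $1/\rho(x_i)$ drop out; the abstract's point about an optimal sampling density corresponds to drawing the $x_i$ from a non-uniform $\rho$ adapted to the model class, which reduces the number of samples needed for the RIP, and hence the error bound, to hold with high probability.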

Citations (17)
