
Approximation by Combinations of ReLU and Squared ReLU Ridge Functions with $\ell^1$ and $\ell^0$ Controls (1607.07819v3)

Published 26 Jul 2016 in stat.ML, math.ST, and stat.TH

Abstract: We establish $L^{\infty}$ and $L^2$ error bounds for functions of many variables that are approximated by linear combinations of ReLU (rectified linear unit) and squared ReLU ridge functions with $\ell^1$ and $\ell^0$ controls on their inner and outer parameters. With the squared ReLU ridge function, we show that the $L^2$ approximation error is inversely proportional to the inner layer $\ell^0$ sparsity and it need only be sublinear in the outer layer $\ell^0$ sparsity. Our constructions are obtained using a variant of the Jones-Barron probabilistic method, which can be interpreted as either stratified sampling with proportionate allocation or two-stage cluster sampling. We also provide companion error lower bounds that reveal near optimality of our constructions. Despite the sparsity assumptions, we showcase the richness and flexibility of these ridge combinations by defining a large family of functions, in terms of certain spectral conditions, that are particularly well approximated by them.
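For orientation, the approximants described in the abstract are ridge combinations of the following form (a sketch based on the abstract alone; the symbols $m$, $a_k$, $w_k$, $b_k$, and $\phi$ are notation introduced here, not necessarily the paper's):

$$
f_m(x) \;=\; \sum_{k=1}^{m} a_k \, \phi\!\left(\langle w_k, x \rangle - b_k\right),
\qquad
\phi(z) = \max(0, z) \ \text{(ReLU)} \ \ \text{or} \ \ \max(0, z)^2 \ \text{(squared ReLU)}.
$$

Under this reading, the $\ell^1$ control bounds quantities such as $\sum_k |a_k|$ on the outer coefficients and the magnitudes of the inner parameters $(w_k, b_k)$, while the $\ell^0$ controls limit how many outer coefficients and inner weights are nonzero, i.e., the sparsity levels that appear in the stated error bounds.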

Citations (137)
