Deep Learning in High Dimension: Neural Network Approximation of Analytic Functions in $L^2(\mathbb{R}^d,\gamma_d)$ (2111.07080v1)

Published 13 Nov 2021 in math.NA, cs.NA, math.PR, and stat.ML

Abstract: For artificial deep neural networks, we prove expression rates for analytic functions $f:\mathbb{R}^d\to\mathbb{R}$ in the norm of $L^2(\mathbb{R}^d,\gamma_d)$ where $d\in\mathbb{N}\cup\{\infty\}$. Here $\gamma_d$ denotes the Gaussian product probability measure on $\mathbb{R}^d$. We consider in particular ReLU and ReLU${}^k$ activations for integer $k\geq 2$. For $d\in\mathbb{N}$, we show exponential convergence rates in $L^2(\mathbb{R}^d,\gamma_d)$. In case $d=\infty$, under suitable smoothness and sparsity assumptions on $f:\mathbb{R}^{\mathbb{N}}\to\mathbb{R}$, with $\gamma_\infty$ denoting an infinite (Gaussian) product measure on $\mathbb{R}^{\mathbb{N}}$, we prove dimension-independent expression rate bounds in the norm of $L^2(\mathbb{R}^{\mathbb{N}},\gamma_\infty)$. The rates only depend on quantified holomorphy of (an analytic continuation of) the map $f$ to a product of strips in $\mathbb{C}^d$. As an application, we prove expression rate bounds of deep ReLU-NNs for response surfaces of elliptic PDEs with log-Gaussian random field inputs.
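
The error norm in the abstract is $\|g\|_{L^2(\mathbb{R}^d,\gamma_d)} = \left(\int_{\mathbb{R}^d} |g(x)|^2\, d\gamma_d(x)\right)^{1/2}$, where $\gamma_d$ is the product of $d$ standard normal distributions, and ReLU${}^k(x) = \max(0,x)^k$. As a minimal illustration (not the paper's construction), the sketch below estimates this error for a small ReLU${}^k$ network by Monte Carlo sampling from $\gamma_d$; the target function, the network architecture, and all parameters are hypothetical choices for demonstration only.

```python
import numpy as np

def relu_k(x, k=2):
    # ReLU^k activation: max(0, x)^k; k=1 recovers the plain ReLU.
    return np.maximum(0.0, x) ** k

def l2_gamma_error(f, net, d, n_samples=100_000, seed=0):
    # Monte Carlo estimate of ||f - net|| in L^2(R^d, gamma_d).
    # gamma_d is the product of d standard normals, so sampling
    # x ~ N(0, I_d) and averaging the squared pointwise error
    # approximates the integral against gamma_d.
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n_samples, d))
    err2 = (f(x) - net(x)) ** 2
    return np.sqrt(err2.mean())

# Hypothetical toy target (not from the paper): the analytic map
# f(x) = exp(x_1), compared against a width-8 ReLU^2 network whose
# weights would in practice come from training or an explicit
# construction; here they are random placeholders.
d = 4
f = lambda x: np.exp(x[:, 0])

rng = np.random.default_rng(1)
W = rng.standard_normal((8, d))   # hidden-layer weights (untrained)
b = np.zeros(8)                   # hidden-layer biases
c = 0.1 * rng.standard_normal(8)  # output weights

net = lambda x: relu_k(x @ W.T + b) @ c

print(l2_gamma_error(f, net, d))
```

Sampling from $\gamma_d$ rather than a uniform grid is what makes the estimate tractable in high dimension: the Monte Carlo error rate is dimension-independent, mirroring the dimension-independent flavor of the expression rate bounds the paper proves for $d=\infty$.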

Citations (3)
