Wide stable neural networks: Sample regularity, functional convergence and Bayesian inverse problems (2407.03909v1)

Published 4 Jul 2024 in math.ST, math.PR, and stat.TH

Abstract: We study the large-width asymptotics of random fully connected neural networks with weights drawn from $\alpha$-stable distributions, a family of heavy-tailed distributions arising as the limiting distributions in the Gnedenko-Kolmogorov heavy-tailed central limit theorem. We show that in an arbitrary bounded Euclidean domain $\mathcal{U}$ with smooth boundary, the random field at the infinite-width limit, characterized in previous literature in terms of finite-dimensional distributions, has sample functions in the fractional Sobolev-Slobodeckij-type quasi-Banach function space $W^{s,p}(\mathcal{U})$ for integrability indices $p < \alpha$ and suitable smoothness indices $s$ depending on the activation function of the neural network, and establish the functional convergence of the processes in $\mathcal{P}(W^{s,p}(\mathcal{U}))$. This convergence result is leveraged in the study of functional posteriors for edge-preserving Bayesian inverse problems with stable neural network priors.
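To make the object of study concrete, below is a minimal sketch of sampling the kind of random field the abstract describes: a one-hidden-layer fully connected network with i.i.d. symmetric $\alpha$-stable weights, with outputs scaled by $n^{-1/\alpha}$ so that stable sums converge as the width $n$ grows. The function name `stable_nn_field`, the `tanh` activation, and the specific width and $\alpha$ values are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np
from scipy.stats import levy_stable

def stable_nn_field(x, width, alpha, rng=None):
    """One-hidden-layer fully connected network with i.i.d. symmetric
    alpha-stable weights, evaluated at inputs x of shape (m, d).

    Output weights are scaled by width**(-1/alpha), the natural scaling
    under which alpha-stable sums converge (heavy-tailed CLT), so wide
    networks approximate an alpha-stable limiting random field.
    """
    rng = np.random.default_rng(rng)
    m, d = x.shape
    # Hidden-layer weights and biases, symmetric alpha-stable (beta = 0).
    W1 = levy_stable.rvs(alpha, 0.0, size=(d, width), random_state=rng)
    b1 = levy_stable.rvs(alpha, 0.0, size=width, random_state=rng)
    # Output-layer weights, also symmetric alpha-stable.
    W2 = levy_stable.rvs(alpha, 0.0, size=width, random_state=rng)
    h = np.tanh(x @ W1 + b1)  # smooth activation (assumption)
    return width ** (-1.0 / alpha) * (h @ W2)

# Evaluate one sample path on a grid in U = (0, 1); increasing the width
# brings the field closer to the infinite-width stable limit.
xs = np.linspace(0.0, 1.0, 50)[:, None]
sample = stable_nn_field(xs, width=5000, alpha=1.8, rng=0)
```

Because the weights are heavy-tailed, individual sample paths exhibit occasional large jumps, which is the heuristic reason such priors are considered edge-preserving in the Bayesian inverse problems the paper studies.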
