A multivariate Riesz basis of ReLU neural networks (2303.00076v1)
Published 28 Feb 2023 in cs.IT, math.FA, and math.IT
Abstract: We consider the trigonometric-like system of piecewise linear functions recently introduced by Daubechies, DeVore, Foucart, Hanin, and Petrova. We provide an alternative proof, based on the Gershgorin theorem, that this system forms a Riesz basis of $L_2([0,1])$. We also generalize the system to higher dimensions $d>1$ via a construction that avoids (tensor) products. As a consequence, the functions from the new Riesz basis of $L_2([0,1]^d)$ can be easily represented by neural networks. Moreover, the Riesz constants of this system are independent of $d$, making it an attractive building block for future multivariate analysis of neural networks.
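The abstract's claim that such piecewise linear basis functions "can be easily represented by neural networks" rests on the fact that sawtooth-like functions are compactly expressible with ReLU activations. The sketch below is only an illustration of that general mechanism, not the paper's construction: it shows that the standard hat function on $[0,1]$ is a one-hidden-layer ReLU network and that its $k$-fold composition yields an oscillating piecewise linear function realizable by a depth-$k$ network. The function names `hat` and `sawtooth` are illustrative choices, not taken from the paper.

```python
import numpy as np

def relu(x):
    """Elementwise ReLU activation."""
    return np.maximum(x, 0.0)

def hat(x):
    # Hat function on [0, 1] written as a one-hidden-layer ReLU network:
    # hat(x) = 2 ReLU(x) - 4 ReLU(x - 1/2) + 2 ReLU(x - 1)
    # (equals 2x on [0, 1/2], 2 - 2x on [1/2, 1], and 0 for x >= 1).
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def sawtooth(x, k):
    # k-fold composition of the hat function: an oscillating piecewise linear
    # function with 2^(k-1) "teeth" on [0, 1], realizable by a depth-k ReLU
    # network. Illustrative only; the paper's trigonometric-like system is a
    # specific normalized family of such piecewise linear functions.
    y = np.asarray(x, dtype=float)
    for _ in range(k):
        y = hat(y)
    return y

if __name__ == "__main__":
    xs = np.linspace(0.0, 1.0, 9)
    print(sawtooth(xs, 3))  # oscillates between 0 and 1 with period 1/4
```

The same compactness carries over to the multivariate setting described in the abstract: because the proposed construction avoids tensor products, each basis function of $L_2([0,1]^d)$ keeps a neural-network representation whose size does not blow up combinatorially in $d$.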