Universal approximation properties of shallow quadratic neural networks (2110.01536v2)
Published 4 Oct 2021 in math.NA and cs.NA
Abstract: In this paper we study shallow neural network functions that are linear combinations of compositions of activation functions with quadratic functions, which replace the standard affine linear functions often called neurons. We show the universality of this approximation and prove convergence rate results based on the theory of wavelets and statistical learning. For simple test cases we show that this ansatz requires a smaller number of neurons than standard affine linear neural networks. Moreover, we investigate the efficiency of this approach for clustering tasks on the MNIST data set. Similar observations are made when comparing deep (multi-layer) networks.
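The idea of replacing the affine pre-activation of a neuron with a quadratic one can be sketched as follows. This is an illustrative NumPy sketch, not the paper's implementation: the parameter names (`A`, `b`, `c`), the choice of sigmoid activation, and the example decision boundary are all assumptions made for clarity.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def quadratic_neuron(x, A, b, c):
    # Quadratic pre-activation x^T A x + b^T x + c, replacing the
    # affine w^T x + b of a standard neuron (parameterization is
    # illustrative, not taken from the paper).
    return sigmoid(x @ A @ x + b @ x + c)

def shallow_quadratic_net(x, neurons, weights):
    # A shallow network as in the abstract: a linear combination of
    # compositions of the activation with quadratic functions,
    # sum_k w_k * sigma(q_k(x)).
    return sum(w * quadratic_neuron(x, A, b, c)
               for w, (A, b, c) in zip(weights, neurons))

# One neuron whose quadratic part encodes the unit circle: the
# pre-activation is ||x||^2 - 1, a radial decision boundary that a
# single affine neuron cannot represent.
A, b, c = np.eye(2), np.zeros(2), -1.0
inside = quadratic_neuron(np.array([0.1, 0.1]), A, b, c)   # ||x||^2 < 1
outside = quadratic_neuron(np.array([2.0, 0.0]), A, b, c)  # ||x||^2 > 1
print(inside < 0.5 < outside)  # → True
```

The radial example hints at why fewer neurons may suffice: level sets such as circles, which require several affine neurons to approximate, are expressible by a single quadratic neuron.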