Neural network integral representations with the ReLU activation function (1910.02743v3)
Published 7 Oct 2019 in cs.LG and stat.ML
Abstract: In this effort, we derive a formula for the integral representation of a shallow neural network with the ReLU activation function. We assume that the outer weights admit a finite $L_1$-norm with respect to Lebesgue measure on the sphere. For univariate target functions we further provide a closed-form formula for all possible representations. Additionally, in this case our formula allows one to explicitly solve for the least $L_1$-norm neural network representation of a given function.
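For context, a shallow ReLU network of width $n$ computes $f_n(x) = \sum_{i=1}^{n} c_i \,(\langle w_i, x\rangle - b_i)_+$, and the integral representation studied here is its infinite-width analogue. A sketch of the standard form, with illustrative notation (the density $c$ and the parametrization of inner weights over the sphere $\mathbb{S}^{d-1}$ are assumptions for exposition, not quoted from the paper):

$$ f(x) \;=\; \int_{\mathbb{S}^{d-1}} \int_{\mathbb{R}} \bigl(\langle w, x\rangle - b\bigr)_+ \, c(w, b)\, \mathrm{d}b \, \mathrm{d}w, \qquad \int_{\mathbb{S}^{d-1}} \int_{\mathbb{R}} |c(w, b)|\, \mathrm{d}b \, \mathrm{d}w \;<\; \infty, $$

where $(t)_+ = \max(t, 0)$ is the ReLU. The "outer weights" of the abstract correspond to the density $c$, the finite $L_1$-norm assumption is the integrability condition on $c$, and the finite-width network $f_n$ is the discrete analogue obtained by replacing $c(w,b)\,\mathrm{d}b\,\mathrm{d}w$ with a sum of point masses at $(w_i, b_i)$.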