Efficient uniform approximation using Random Vector Functional Link networks (2306.17501v2)
Abstract: A Random Vector Functional Link (RVFL) network is a depth-2 neural network with random inner weights and biases. Only the outer weights of such an architecture are learned, so the learning process reduces to a linear optimization task, allowing one to sidestep the pitfalls of nonconvex optimization problems. In this paper, we prove that an RVFL with ReLU activation functions can approximate Lipschitz continuous functions in the $L_\infty$ norm. To the best of our knowledge, this is the first $L_\infty$-norm approximation result using nice inner weights, namely Gaussians. We give a nonasymptotic lower bound on the number of hidden-layer nodes needed to achieve a given accuracy with high probability, depending on, among other things, the Lipschitz constant of the target function, the desired accuracy, and the input dimension. Our method of proof is rooted in probability theory and harmonic analysis.
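The training pipeline described above can be sketched in a few lines of NumPy: draw Gaussian inner weights and biases, freeze them, and fit only the outer weights by least squares. This is a minimal illustrative sketch, not the paper's exact construction; the function names and the toy target function are hypothetical.

```python
import numpy as np

def rvfl_fit(X, y, n_hidden=300, seed=0):
    """Fit a depth-2 RVFL network: Gaussian inner weights and biases
    are drawn once and frozen; only the outer weights are learned,
    which is a linear least-squares problem."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.standard_normal((d, n_hidden))  # random inner weights (not trained)
    b = rng.standard_normal(n_hidden)       # random biases (not trained)
    H = np.maximum(X @ W + b, 0.0)          # hidden layer: ReLU(Wx + b)
    c, *_ = np.linalg.lstsq(H, y, rcond=None)  # learned outer weights
    return W, b, c

def rvfl_predict(X, W, b, c):
    return np.maximum(X @ W + b, 0.0) @ c

# Toy example: approximate a Lipschitz function on [-1, 1]^2.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(500, 2))
y = np.abs(X[:, 0]) + np.sin(X[:, 1])
W, b, c = rvfl_fit(X, y, n_hidden=300)
err = np.max(np.abs(rvfl_predict(X, W, b, c) - y))  # empirical sup-norm error
```

Because the inner layer is frozen, the only trained parameters are the outer weights `c`, so the fit is a single convex (in fact linear) problem with no risk of bad local minima.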