
Efficient uniform approximation using Random Vector Functional Link networks (2306.17501v2)

Published 30 Jun 2023 in stat.ML and cs.LG

Abstract: A Random Vector Functional Link (RVFL) network is a depth-2 neural network with random inner weights and biases. Only the outer weights of such an architecture are to be learned, so the learning process boils down to a linear optimization task, allowing one to sidestep the pitfalls of nonconvex optimization problems. In this paper, we prove that an RVFL with ReLU activation functions can approximate Lipschitz continuous functions in $L_\infty$ norm. To the best of our knowledge, our result is the first approximation result in $L_\infty$ norm using nice inner weights; namely, Gaussians. We give a nonasymptotic lower bound for the number of hidden-layer nodes to achieve a given accuracy with high probability, depending on, among other things, the Lipschitz constant of the target function, the desired accuracy, and the input dimension. Our method of proof is rooted in probability theory and harmonic analysis.
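To make the setup concrete, below is a minimal sketch of an RVFL as described in the abstract: Gaussian random inner weights and biases are drawn once and frozen, the hidden layer applies ReLU, and only the outer weights are fit, which is a linear (here, ridge-regularized) least-squares problem. The hidden width, weight scale, ridge parameter, and target function are illustrative choices, not the paper's construction or its node-count bound.

```python
import numpy as np

# Minimal RVFL sketch (illustrative; not the paper's exact construction or bounds).
# Inner weights/biases are random Gaussians and are never trained; only the
# outer weights are learned, via a convex least-squares problem.

rng = np.random.default_rng(0)

def rvfl_fit(X, y, n_hidden=500, weight_scale=1.0, ridge=1e-6):
    """Fit the outer weights of a ReLU RVFL by ridge-regularized least squares."""
    d = X.shape[1]
    W = rng.normal(scale=weight_scale, size=(d, n_hidden))  # random Gaussian inner weights
    b = rng.normal(scale=weight_scale, size=n_hidden)       # random Gaussian biases
    H = np.maximum(X @ W + b, 0.0)                           # ReLU hidden-layer features
    # Linear problem for the outer weights: (H^T H + ridge I) beta = H^T y.
    beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def rvfl_predict(X, W, b, beta):
    return np.maximum(X @ W + b, 0.0) @ beta

# Example: approximate a Lipschitz target f(x) = |x1| + sin(x2) on [-1, 1]^2
# and report the worst-case error on a test sample, echoing the L_infinity setting.
X = rng.uniform(-1, 1, size=(2000, 2))
y = np.abs(X[:, 0]) + np.sin(X[:, 1])
W, b, beta = rvfl_fit(X, y)
X_test = rng.uniform(-1, 1, size=(500, 2))
y_test = np.abs(X_test[:, 0]) + np.sin(X_test[:, 1])
print("max abs error:", np.max(np.abs(rvfl_predict(X_test, W, b, beta) - y_test)))
```

Because the hidden features are fixed once the random draw is made, the only trained quantity is `beta`, so the usual nonconvexity of neural-network training never enters.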

