Approximation of RKHS Functionals by Neural Networks (2403.12187v1)

Published 18 Mar 2024 in stat.ML, cs.LG, math.ST, and stat.TH

Abstract: Motivated by the abundance of functional data such as time series and images, there has been a growing interest in integrating such data into neural networks and learning maps from function spaces to R (i.e., functionals). In this paper, we study the approximation of functionals on reproducing kernel Hilbert spaces (RKHS's) using neural networks. We establish the universality of the approximation of functionals on the RKHS's. Specifically, we derive explicit error bounds for those induced by inverse multiquadric, Gaussian, and Sobolev kernels. Moreover, we apply our findings to functional regression, proving that neural networks can accurately approximate the regression maps in generalized functional linear models. Existing works on functional learning require integration-type basis function expansions with a set of pre-specified basis functions. By leveraging the interpolating orthogonal projections in RKHS's, our proposed network is much simpler in that we use point evaluations to replace basis function expansions.
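As a rough illustration of the point-evaluation idea described in the abstract (every concrete choice below — the Gaussian kernel bandwidth, the grid size, the network width, and the random weights — is a hypothetical sketch, not the paper's actual construction): a functional on a Gaussian-kernel RKHS can be fed to an ordinary feed-forward network whose input is just the vector of point evaluations (f(x_1), …, f(x_m)), with no integration-type basis expansion.

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian (RBF) kernel inducing the RKHS; the bandwidth is an arbitrary choice.
def k(x, y, gamma=10.0):
    return np.exp(-gamma * (x - y) ** 2)

# A sample RKHS element: a finite kernel expansion f = sum_j c_j k(., z_j).
centers = rng.uniform(0.0, 1.0, size=8)
coeffs = rng.normal(size=8)
def f(x):
    return sum(c * k(x, z) for c, z in zip(coeffs, centers))

# Point-evaluation features: f sampled at m fixed grid points x_1..x_m.
# This vector replaces an integration-type basis-function expansion.
grid = np.linspace(0.0, 1.0, 16)
features = np.array([f(x) for x in grid])          # shape (16,)

# A tiny two-layer ReLU network (untrained, random weights) mapping the
# evaluation vector to a scalar -- the form a functional approximant takes.
W1, b1 = rng.normal(size=(32, 16)), np.zeros(32)
W2, b2 = rng.normal(size=(1, 32)), np.zeros(1)
def net(v):
    h = np.maximum(0.0, W1 @ v + b1)
    return (W2 @ h + b2)[0]

print(features.shape)        # (16,)
print(float(net(features)))  # a scalar output (weights here are untrained)
```

In practice the weights would be fit by regression on pairs (f, F(f)); the sketch only shows why the input layer can be a plain evaluation vector rather than a set of pre-specified basis integrals.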
