Universal Approximation Theorem for Vector- and Hypercomplex-Valued Neural Networks (2401.02277v2)
Abstract: The universal approximation theorem states that a neural network with one hidden layer can approximate continuous functions on compact sets to any desired precision. This theorem underpins the use of neural networks in applications such as regression and classification. It holds for real-valued neural networks and for some hypercomplex-valued models, including complex-, quaternion-, tessarine-, and Clifford-valued neural networks. Hypercomplex-valued neural networks are, in turn, a particular class of vector-valued neural networks defined on an algebra with additional algebraic or geometric properties. This paper extends the universal approximation theorem to a broad class of vector-valued neural networks, which includes hypercomplex-valued models as particular instances. Precisely, we introduce the concept of a non-degenerate algebra and state the universal approximation theorem for neural networks defined on such algebras.
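The approximation guarantee described above can be illustrated numerically. The following is a minimal sketch, not taken from the paper: a real-valued network with a single hidden layer of sigmoid units approximating a continuous function on a compact interval. For simplicity, the hidden weights are drawn at random and only the output layer is fitted by least squares (an extreme-learning-machine-style shortcut), which is enough to observe the small uniform error the theorem guarantees is attainable; the target function, layer width, and weight scales are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # A continuous function on the compact set [0, 1].
    return np.sin(2 * np.pi * x)

n_hidden = 200
W = rng.normal(scale=10.0, size=(1, n_hidden))  # random input-to-hidden weights
b = rng.uniform(-10.0, 10.0, size=n_hidden)     # random hidden biases

def hidden(x):
    # Sigmoid activation, as in the classical statement of the theorem.
    return 1.0 / (1.0 + np.exp(-(x[:, None] @ W + b)))

# Fit only the output weights by least squares on a training grid.
x_train = np.linspace(0.0, 1.0, 500)
c, *_ = np.linalg.lstsq(hidden(x_train), target(x_train), rcond=None)

# Measure the uniform (sup-norm) error on a finer test grid.
x_test = np.linspace(0.0, 1.0, 1000)
error = np.max(np.abs(hidden(x_test) @ c - target(x_test)))
print(f"max approximation error: {error:.4f}")
```

Increasing the hidden-layer width drives the uniform error down, which is the qualitative content of the theorem; the paper's contribution is establishing this guarantee for networks whose weights and activations live in a non-degenerate vector algebra rather than in the reals.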