
A universal reproducing kernel Hilbert space for learning nonlinear systems operators

Published 24 Dec 2024 in math.OC, eess.SY, and math.DS | (2412.18360v1)

Abstract: In this work, we consider the problem of learning nonlinear operators that correspond to discrete-time nonlinear dynamical systems with inputs. Given an initial state and a finite input trajectory, such operators yield a finite output trajectory compatible with the system dynamics. Inspired by the universal approximation theorem of operators tailored to radial basis functions neural networks, we construct a class of kernel functions as the product of kernel functions in the space of input trajectories and initial states, respectively. We prove that for positive definite kernel functions, the resulting product reproducing kernel Hilbert space is dense and even complete in the space of nonlinear systems operators, under suitable assumptions. This provides a universal kernel-functions-based framework for learning nonlinear systems operators, which is intuitive and easy to apply to general nonlinear systems.

Summary

  • The paper presents a universal product RKHS that can approximate any nonlinear system operator under suitable conditions.
  • It builds on the universal approximation theorem for RBF networks; the resulting product Gram matrix has full rank, guaranteeing unique solutions.
  • The framework demonstrates computational efficiency and scalability in large-scale nonlinear dynamics, validated on the Van der Pol oscillator.

A Universal Reproducing Kernel Hilbert Space for Learning Nonlinear Systems Operators

Introduction

The paper "A Universal Reproducing Kernel Hilbert Space for Learning Nonlinear Systems Operators" (2412.18360) addresses the challenge of learning nonlinear operators that correspond to discrete-time nonlinear dynamical systems with inputs. The authors propose a class of kernel functions designed as products of kernel functions over input trajectories and initial states. These product kernels induce a novel reproducing kernel Hilbert space (RKHS) that is dense and complete under suitable conditions, providing a systematic framework for learning nonlinear system operators.

Theoretical Framework

The primary objective is to construct an RKHS that can universally approximate nonlinear system operators. The approach leverages the universal approximation theorem for radial basis function (RBF) neural networks, extending it to define product kernels. The proposed product kernel function $k_\otimes$, defined over the Cartesian product of the input and state spaces, induces an RKHS $H(k_\otimes, U \times X)$. Positive definiteness of the factor kernels ensures that the corresponding product RKHS is dense, and even complete, in the space of nonlinear systems operators.
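The product construction described above can be sketched in a few lines. This is an illustrative sketch, not the paper's exact construction: the Gaussian RBF choice, the kernel widths, and the sample data are all assumptions made here for concreteness.

```python
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    """Gaussian RBF kernel between two flattened vectors (an assumed choice)."""
    d = np.asarray(a, float).ravel() - np.asarray(b, float).ravel()
    return np.exp(-gamma * np.dot(d, d))

def product_kernel(u1, x1, u2, x2, gamma_u=1.0, gamma_x=1.0):
    """k_otimes((u1, x1), (u2, x2)) = k_u(u1, u2) * k_x(x1, x2)."""
    return rbf_kernel(u1, u2, gamma_u) * rbf_kernel(x1, x2, gamma_x)

# Example: two (input trajectory, initial state) pairs.
u1, x1 = np.array([0.1, 0.2, 0.3]), np.array([1.0, 0.0])
u2, x2 = np.array([0.1, 0.25, 0.3]), np.array([0.9, 0.1])
v = product_kernel(u1, x1, u2, x2)  # close to 1 for nearby pairs
```

Since each factor is a positive definite kernel, their product is again positive definite, which is the property the density and completeness results rest on.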

The universal approximation theorem for RBF networks underpins the construction of the Gram matrix in the RKHS. Specifically, it shows that a radial basis function that is not an even polynomial can serve as a kernel that ensures universality. This leads to a product Gram matrix $K_\otimes$ whose full rank guarantees a unique solution to the regression problem in the RKHS, thereby supporting accurate operator learning.
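The full-rank argument can be illustrated numerically. In this sketch, a product Gram matrix is assembled from randomly sampled (input trajectory, initial state) pairs and used to fit one output coordinate; the Gaussian kernels, widths, ridge level, and data are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np

def product_gram(U, X, gamma_u=1.0, gamma_x=1.0):
    """Gram matrix K[i, j] = k_u(U[i], U[j]) * k_x(X[i], X[j])."""
    n = len(U)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            du, dx = U[i] - U[j], X[i] - X[j]
            K[i, j] = np.exp(-gamma_u * du @ du - gamma_x * dx @ dx)
    return K

rng = np.random.default_rng(0)
U = rng.normal(size=(20, 5))   # 20 sampled input trajectories of length 5
X = rng.normal(size=(20, 2))   # the 20 corresponding initial states
y = rng.normal(size=20)        # one scalar output coordinate to fit

K = product_gram(U, X)
# Distinct samples and a strictly positive definite kernel give K full rank,
# so the (lightly regularized) linear system has a unique solution.
alpha = np.linalg.solve(K + 1e-10 * np.eye(20), y)
```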

Product Reproducing Kernel Hilbert Spaces

The product RKHS $H(k_\otimes, U \times X)$ is formed from kernel functions $k_u$ and $k_x$ applied to input trajectories and initial states, respectively. The paper demonstrates that the product of positive definite kernels yields an RKHS that is both dense and complete for operator learning tasks. Denseness ensures that any given system operator can be approximated within any desired accuracy by elements of the RKHS. Completeness guarantees that the associated orthonormal system spans the entire space of operators under consideration.
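The denseness property can be paraphrased as follows; this is an informal sketch, with the precise operator class, norms, and standing assumptions as stated in the paper:

```latex
% Informal paraphrase: finite kernel expansions approximate any operator F.
\forall F,\ \forall \varepsilon > 0,\ \exists N \in \mathbb{N},\
c_1,\dots,c_N \in \mathbb{R},\ (u_i, x_i) \in U \times X :
\quad
\sup_{(u,x) \in U \times X}
\Bigl|\, F(u,x) - \sum_{i=1}^{N} c_i\, k_\otimes\bigl((u,x),(u_i,x_i)\bigr) \Bigr|
< \varepsilon .
```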

Practical Implications and Efficiency

From a practical standpoint, the proposed RKHS framework offers computational efficiency by scaling well with the number of data points. This efficiency is crucial in applications involving the large data sets typical of nonlinear system dynamics. Whereas standard kernel methods may run into computational limits, the Kronecker product structure of the product Gram matrix keeps the required linear-algebra operations tractable.
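The Kronecker shortcut can be sketched under an assumed grid-structured data set: when every sampled input trajectory is paired with every sampled initial state, the product Gram matrix factors as `kron(K_u, K_x)`, and one big linear solve reduces to two solves with the small factor matrices. The kernels, sizes, and data below are illustrative assumptions.

```python
import numpy as np

def rbf_gram(Z, gamma=1.0):
    """Gaussian RBF Gram matrix for the rows of Z."""
    sq = np.sum(Z**2, axis=1)
    D = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T
    return np.exp(-gamma * D)

rng = np.random.default_rng(1)
Ku = rbf_gram(rng.normal(size=(4, 5)))   # 4 input trajectories
Kx = rbf_gram(rng.normal(size=(3, 2)))   # 3 initial states
y = rng.normal(size=12)                  # outputs on the full 4 x 3 grid

# Direct solve on the assembled 12 x 12 Kronecker matrix ...
alpha_direct = np.linalg.solve(np.kron(Ku, Kx), y)

# ... versus the "vec trick": with row-major flattening,
# kron(Ku, Kx) @ vec(A) = vec(Ku @ A @ Kx^T), and Kx is symmetric,
# so A = Ku^{-1} Y Kx^{-1} with Y = y reshaped to the 4 x 3 grid.
Y = y.reshape(4, 3)
alpha_kron = np.linalg.solve(Kx, np.linalg.solve(Ku, Y).T).T.ravel()
```

For n_u trajectories and n_x states, the factored solve works with matrices of size n_u and n_x instead of n_u * n_x, which is where the scalability claim comes from.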

Simulation and Examples

The paper illustrates the efficacy of the product RKHS through a detailed example involving the Van der Pol oscillator. The learning framework is compared with standard RKHS methods, demonstrating superior scalability and prediction accuracy. The results underscore the framework's capability to handle complex nonlinear dynamics effectively without an excessive computational burden.
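A minimal end-to-end sketch in the same spirit: learn the one-step map of an Euler-discretized Van der Pol oscillator with kernel ridge regression on a product RBF kernel. The discretization step, the parameter mu, the kernel widths, and the ridge level are all illustrative choices assumed here, not the paper's experimental setup.

```python
import numpy as np

def vdp_step(x, u, mu=1.0, dt=0.05):
    """Euler-discretized Van der Pol dynamics with a scalar input u."""
    x1, x2 = x
    return np.array([x1 + dt * x2,
                     x2 + dt * (mu * (1.0 - x1**2) * x2 - x1 + u)])

rng = np.random.default_rng(2)
X0 = rng.uniform(-2, 2, size=(200, 2))   # sampled initial states
U = rng.uniform(-1, 1, size=200)         # sampled scalar inputs
Y = np.array([vdp_step(x, u) for x, u in zip(X0, U)])

# exp(-||dx||^2 - |du|^2) factors into k_x(dx) * k_u(du): a product kernel.
Z = np.hstack([X0, U[:, None]])
sq = np.sum(Z**2, axis=1)
K = np.exp(-(sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T))
alpha = np.linalg.solve(K + 1e-6 * np.eye(200), Y)   # ridge coefficients

def predict(x, u):
    """Predict the next state from the learned kernel expansion."""
    z = np.concatenate([x, [u]])
    k = np.exp(-np.sum((Z - z)**2, axis=1))
    return k @ alpha

pred = predict(np.array([0.5, -0.5]), 0.3)
true = vdp_step(np.array([0.5, -0.5]), 0.3)
```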

Conclusion

In conclusion, this research delineates a comprehensive and efficient framework for learning nonlinear operators in discrete-time dynamical systems. Through the strategic use of product kernels, it establishes a universal RKHS that is both dense and complete, offering promising avenues for data-driven learning and control in nonlinear systems. Future work is anticipated to extend these results to continuous-time systems and to explore the interplay between the proposed RKHS structure and Koopman operator theory, with potential impact on fundamental questions in data-driven control and prediction.
