Koopman Embeddings for Nonlinear Dynamics

Updated 14 December 2025
  • Koopman embeddings are data-driven coordinate transformations that recast nonlinear dynamical systems into an approximately linear regime using Koopman operators.
  • They utilize mathematical constructs like eigenfunctions and algorithmic tools such as DMD, EDMD, kernel methods, and deep autoencoders to derive finite-dimensional approximations.
  • These methods enable robust prediction, control, and modal analysis across diverse applications, despite challenges in handling multiple attractors and continuous spectra.

Koopman embeddings are data-driven coordinate transformations that recast the evolution of nonlinear dynamical systems into an approximately linear regime through the action of Koopman operators—linear but generally infinite-dimensional objects that act on observables of the system state. This perspective allows nonlinear dynamics to be analyzed, predicted, and controlled using linear techniques, provided suitable finite-dimensional embeddings (often constructed as nonlinear functions or neural network outputs) can be identified such that evolution in those coordinates is well-approximated by a linear operator. Rigorous theory establishes the connection between operator-theoretic, geometric, and control-theoretic aspects of dynamical systems, while a rapidly developing suite of algorithmic tools—ranging from Dynamic Mode Decomposition (DMD) and Extended DMD (EDMD) to modern deep autoencoder architectures—enables practical construction of Koopman embeddings from finite trajectory data (Brunton et al., 2021).

1. Mathematical Foundations of Koopman Embeddings

Given a discrete-time dynamical system $x_{k+1} = F(x_k)$, the Koopman operator $\mathcal{K}$ acts linearly on observables $g:\mathbb{R}^n \rightarrow \mathbb{C}$ by composition: $\mathcal{K}g(x) = g(F(x))$. Koopman eigenfunctions $\varphi$ satisfy $\mathcal{K}\varphi(x) = \lambda\varphi(x)$, with $\lambda \in \mathbb{C}$. If a finite set of such eigenfunctions $\{\varphi_1,\ldots,\varphi_r\}$ is found, the embedded coordinate $z = [\varphi_1(x),\ldots,\varphi_r(x)]^\top$ evolves linearly: $z_{k+1} = K z_k$, with $K = \mathrm{diag}(\lambda_1, \ldots, \lambda_r)$. Observables $g$ that lie in the span of these eigenfunctions follow $g(x_k) = v^\top z_k = \sum_{j} v_j \lambda_j^k \varphi_j(x_0)$ (Brunton et al., 2021).
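
As a concrete illustration (a minimal sketch; a discrete-time analogue of the standard example in Brunton et al., 2021, with illustrative constants), consider $x_{1,k+1} = a x_{1,k}$, $x_{2,k+1} = b x_{2,k} + c x_{1,k}^2$. The observables $(x_1, x_2, x_1^2)$ span a Koopman-invariant subspace on which the dynamics are exactly linear:

```python
import numpy as np

# Toy system with an exactly Koopman-invariant subspace spanned by
# the observables (x1, x2, x1^2); constants a, b, c are illustrative.
a, b, c = 0.9, 0.5, 1.0

def F(x):
    x1, x2 = x
    return np.array([a * x1, b * x2 + c * x1**2])

def lift(x):
    return np.array([x[0], x[1], x[0]**2])

# Exact finite-dimensional Koopman matrix on the lifted coordinates:
# x1 -> a*x1, x2 -> b*x2 + c*x1^2, x1^2 -> a^2 * x1^2.
K = np.array([[a, 0.0, 0.0],
              [0.0, b, c],
              [0.0, 0.0, a**2]])

x = np.array([1.0, -0.5])
z = lift(x)
for _ in range(10):
    x = F(x)
    z = K @ z
print(np.allclose(lift(x), z))  # True: lifted dynamics are exactly linear
```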

The embedding challenge centers on constructing finite sets of functions that span approximately invariant subspaces for $\mathcal{K}$, yielding linear finite-dimensional evolution in $z$-space (Brunton et al., 2021).

2. Algorithms for Constructing Koopman Embeddings

2.1 Linear and Kernel-Based Approaches

  • Dynamic Mode Decomposition (DMD): Based on snapshot pairs and SVD reduction, DMD yields eigenvalues and modes approximating the spectrum and invariant features of the Koopman operator (Brunton et al., 2021); see the sketch after this list.
  • Extended DMD (EDMD): Generalizes DMD by introducing a flexible, user-defined dictionary of basis functions, enabling richer nonlinear embeddings (Brunton et al., 2021).
  • Kernel Methods: Kernel-based algorithms approximate $\mathcal{K}$ in a reproducing kernel Hilbert space (RKHS), using the kernel trick to avoid explicit high-dimensional expansions; approximate eigenfunctions are represented as sums over kernel evaluations (Brunton et al., 2021, Hou et al., 27 Jan 2025).
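
A minimal exact-DMD sketch in NumPy (a generic textbook formulation, not code from the cited references): snapshot pairs are regressed through a truncated SVD, and the eigendecomposition of the reduced operator yields the DMD eigenvalues and modes.

```python
import numpy as np

def dmd(X, Y, r):
    """Exact DMD from snapshot matrices X (states) and Y (successor states).

    X, Y: (n, m) arrays with Y[:, k] = F(X[:, k]); r: SVD truncation rank.
    Returns DMD eigenvalues and modes approximating Koopman spectra/modes.
    """
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :r], s[:r], Vh[:r, :]
    # Reduced operator: A_r = U^T Y V S^{-1}
    A_r = U.T @ Y @ Vh.T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(A_r)
    modes = Y @ Vh.T @ np.diag(1.0 / s) @ W  # exact DMD modes
    return eigvals, modes

# Usage on a linear toy system x_{k+1} = A x_k (eigenvalues are recovered):
rng = np.random.default_rng(0)
A = np.array([[0.9, 0.2], [0.0, 0.8]])
X = rng.normal(size=(2, 100))
Y = A @ X
print(np.sort(dmd(X, Y, r=2)[0].real))  # approx [0.8, 0.9]
```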

2.2 Deep Learning Strategies

  • Autoencoder-Based Embeddings: Deep encoders $\phi_\theta$ map the raw state $x$ (or time-delay vectors) into a latent space $z$, where the operator $K$ models linear (or locally linear) dynamics: $z_{k+1} = K z_k$ (Lusch et al., 2017, Jayarathne et al., 2023). The decoder reconstructs $x$ from $z$.
  • “Lift and Regress”: A feed-forward encoder produces high-dimensional hidden coordinates $z = \phi_\theta(x)$, and $K$ is fitted by minimizing $\|\phi_\theta(x_{k+1}) - K\phi_\theta(x_k)\|^2$, favoring linearization in latent space (Brunton et al., 2021); see the sketch after this list.
  • Auxiliary Networks for Continuous Spectra: For systems with non-isolated frequencies (e.g., undamped pendulum), auxiliary networks parameterize frequency and growth rate as functions of latent state radius, enabling efficient, compact embeddings for continuous Koopman spectra (Lusch et al., 2017).
  • Message-Passing Networks for Graph Dynamics: Exploiting underlying geometric structure via graph-aware neural networks yields globally valid linearizations of large-scale network dynamics (Yeh et al., 2023).
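
The lift-and-regress step can be illustrated with a fixed random-feature lifting standing in for a trained encoder (an assumption made here for brevity; the cited works learn the encoder): lift the snapshots, then fit $K$ by least squares.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fixed random-feature "encoder" standing in for a trained network phi_theta.
W = rng.normal(size=(64, 2))
b = rng.uniform(0, 2 * np.pi, size=64)

def phi(X):
    """Lift states (2, m) to random Fourier features (64, m)."""
    return np.cos(W @ X + b[:, None])

# Trajectory data from a weakly nonlinear toy map (illustrative only).
def F(x):
    return np.array([0.95 * x[0], 0.8 * x[1] + 0.1 * x[0] ** 2])

X = rng.normal(size=(2, 500))
Y = np.apply_along_axis(F, 0, X)

# Least-squares Koopman matrix in lifted coordinates: K = phi(Y) phi(X)^+.
Z, Zp = phi(X), phi(Y)
K = Zp @ np.linalg.pinv(Z)
print(np.linalg.norm(Zp - K @ Z) / np.linalg.norm(Zp))  # small residual
```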

2.3 Algorithmic Pipeline (Standard Structure)

| Algorithm | Encoder | Operator | Decoder | Embedding Type |
|---|---|---|---|---|
| DMD | Linear projection | Least-squares | Linear | Modal |
| EDMD | Dictionary basis | Least-squares | Linear | Modal/nonlinear |
| Kernel DMD | Kernel functions | Kernel regression | Linear | Nonparametric |
| Deep Koopman AE | Deep NN (MLP) | Matrix (learned) | Deep NN (MLP) | Nonlinear |
| Koopman MPNN | Graph MPNN | Diagonal | MPNN | Graph-structured |

3. Theoretical Guarantees, Limitations, and Spectral Structure

The convergence of EDMD-type algorithms is established for ergodic dynamics and invariant dictionaries: the estimated operator satisfies $K \rightarrow P_{\mathcal{D}} \mathcal{K} P_{\mathcal{D}}$ as the number of snapshots $m \rightarrow \infty$; in $L^2$ the approximation converges strongly to $\mathcal{K}$ (Brunton et al., 2021). Koopman embedding theory is tightly connected to contraction analysis and Lyapunov stability: every discrete-time contracting nonlinear model admits a finite-dimensional stable Koopman embedding, proved via direct parameterization of stable $K$ (Fan et al., 2021).

Koopman operators may exhibit pure point, continuous, and residual spectra. Classical algorithms target point spectrum; continuous-spectrum systems (mixing or quasi-periodic) are more challenging, necessitating harmonic averaging or continuous auxiliary networks for accurate embedding (Brunton et al., 2021, Lusch et al., 2017).
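
As an illustration of harmonic averaging (a standard construction; the specific map and observable below are illustrative assumptions), the average $\varphi_\omega(x) = \lim_{N\to\infty} \frac{1}{N}\sum_{k=0}^{N-1} e^{-i\omega k}\, g(F^k(x))$ projects an observable $g$ onto the Koopman eigenspace at frequency $\omega$; for the circle rotation $F(\theta) = \theta + \alpha$ it recovers the eigenfunction $e^{i\theta}$ with eigenvalue $e^{i\alpha}$.

```python
import numpy as np

alpha = 0.3                      # rotation angle; eigenvalue is exp(i*alpha)
F = lambda th: (th + alpha) % (2 * np.pi)
g = lambda th: np.exp(1j * th)   # observable; itself an eigenfunction here

def harmonic_average(theta0, omega, N=100_000):
    """Approximate the projection of g onto the eigenspace at frequency omega."""
    acc, th = 0.0 + 0.0j, theta0
    for k in range(N):
        acc += np.exp(-1j * omega * k) * g(th)
        th = F(th)
    return acc / N

# At omega = alpha the average converges to g(theta0); elsewhere it vanishes.
print(abs(harmonic_average(1.0, alpha)))  # ~1
print(abs(harmonic_average(1.0, 0.11)))   # ~0
```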

A significant limitation is the non-existence of one-to-one continuous finite-dimensional Koopman immersions for nonlinear systems with multiple omega-limit sets: any continuous embedding must collapse all limit sets in the image, and learned approximate immersions exhibit the same failure in the regime of dense data and small sampling interval (Liu et al., 2023). This fundamental obstruction shapes the choice of embedding domain and the practical utility of the representations.

4. Extensions for Systems with Inputs, Control, and Hybrid Modeling

Koopman embeddings have been extended to nonlinear systems with inputs by lifting to linear parameter-varying (LPV) forms: under mild differentiability and convexity conditions, the lifted dynamics take the form $z_{k+1} = A z_k + B(z_k, u_k)u_k$, with $B$ state- or input-dependent (Iacob et al., 2022). For control-affine systems, bilinear or affine models emerge, and error bounds quantify the approximation quality of using a constant $B$ in linear time-invariant (LTI) Koopman models.
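
A common batch baseline for the constant-$B$ LTI case is EDMD with control (a standard formulation in the literature, not the LPV scheme of Iacob et al., 2022): stack lifted states and inputs and solve a single least-squares problem for $[A \; B]$. The dictionary and toy system below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def lift(X):
    """Illustrative dictionary: states plus elementwise squares."""
    return np.vstack([X, X**2])

# Data from a toy controlled system (assumed for illustration).
def step(x, u):
    return np.array([0.9 * x[0] + 0.1 * x[1] ** 2, 0.8 * x[1] + u])

m = 400
X = rng.normal(size=(2, m)); U = rng.normal(size=(1, m))
Y = np.stack([step(X[:, k], U[0, k]) for k in range(m)], axis=1)

Z, Zp = lift(X), lift(Y)
# Solve Zp ~= [A B] [Z; U] in the least-squares sense.
AB = Zp @ np.linalg.pinv(np.vstack([Z, U]))
A, B = AB[:, : Z.shape[0]], AB[:, Z.shape[0]:]
print(A.shape, B.shape)  # (4, 4) (4, 1)
```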

Delay-coordinate embeddings (Hankel maps) enable linear modeling of non-smooth periodic or hybrid systems by stacking current and history states, so that $X_{k+1} = A X_k + B U_k$, with $X_k$ the time-delay vector (Yang et al., 19 Jul 2025). This approach remains valid as long as the system's modal sequence and event timing are periodic and consistent. It facilitates history-augmented control synthesis through LQR in the lifted space.
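
A minimal sketch of the delay-coordinate construction (generic; the delay depth and test trajectory are illustrative): form time-delay vectors $X_k = [x_k^\top, \ldots, x_{k-d+1}^\top]^\top$, after which $(A, B)$ can be fitted by the same least-squares regression shown above.

```python
import numpy as np

def delay_embed(xs, d):
    """Stack d consecutive states into time-delay vectors.

    xs: (n, T) state trajectory; returns (n*d, T-d+1) Hankel-style matrix
    whose column k is [x_{k+d-1}; ...; x_k].
    """
    n, T = xs.shape
    cols = T - d + 1
    return np.vstack([xs[:, d - 1 - i : d - 1 - i + cols] for i in range(d)])

xs = np.arange(10.0).reshape(1, 10)   # scalar trajectory 0..9
H = delay_embed(xs, d=3)
print(H[:, 0])  # [2., 1., 0.]: current state on top, history below
```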

For block-oriented polynomial systems, exact finite-dimensional Koopman embeddings can be constructed by stacking Kronecker monomials, yielding polynomial-input time-invariant lifted models, which reduce to bilinear form under no-feedthrough conditions (Iacob et al., 20 Jul 2025).
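
As a simple instance of the input-lifting flavor of this construction (an illustrative example, not taken from the cited paper), a system with a polynomial input nonlinearity becomes exactly time-invariant once input monomials are stacked as a lifted input:

$$x_{k+1} = A x_k + B_1 u_k + B_2 u_k^2 = A x_k + \begin{bmatrix} B_1 & B_2 \end{bmatrix}\begin{bmatrix} u_k \\ u_k^2 \end{bmatrix}.$$

With state-dependent nonlinearities, Kronecker monomials of the state play the analogous role, and under no-feedthrough conditions the lifted model reduces to bilinear form (Iacob et al., 20 Jul 2025).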

5. Structure-Preserving Embeddings, Stability, and Hamiltonian Systems

Recent advances in deep learning architectures incorporate additional geometric and stability constraints. For canonical nonlinear Hamiltonian systems, embeddings are sought that preserve symplectic structure via explicit Jacobian penalties in the loss function: $(D\varphi(x))^\top J_{2m}\, D\varphi(x) = J_{2n}$ (Goyal et al., 2023). Cubicization (embedding via higher-degree monomials) overcomes limitations for systems with continuous Koopman spectra (e.g., the nonlinear pendulum), enabling long-term stable prediction, with sum-of-squares parameterization of the learned Hamiltonian guaranteeing boundedness.
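
A minimal sketch of the symplectic Jacobian penalty in PyTorch (the encoder `enc`: $\mathbb{R}^{2n} \to \mathbb{R}^{2m}$, its architecture, and the dimensions are illustrative assumptions):

```python
import torch

def J(d):
    """Canonical symplectic matrix of size 2d x 2d."""
    I = torch.eye(d)
    Z = torch.zeros(d, d)
    return torch.cat([torch.cat([Z, I], 1), torch.cat([-I, Z], 1)], 0)

def symplectic_penalty(enc, x, n, m):
    """|| Dphi(x)^T J_2m Dphi(x) - J_2n ||_F^2 for a single sample x."""
    Dphi = torch.autograd.functional.jacobian(enc, x)  # shape (2m, 2n)
    return torch.linalg.norm(Dphi.T @ J(m) @ Dphi - J(n)) ** 2

# Usage with a toy encoder (illustrative): 2n = 2, 2m = 4.
enc = torch.nn.Sequential(torch.nn.Linear(2, 16), torch.nn.Tanh(),
                          torch.nn.Linear(16, 4))
x = torch.randn(2)
print(symplectic_penalty(enc, x, n=1, m=2))
```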

A key theoretical result is that convex parameterizations directly enforce stability; all Schur-stable linear operators $K$ can be parameterized without explicit inequality constraints (Fan et al., 2021).
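
To give a flavor of unconstrained stability parameterization (a simpler sufficient construction for illustration only, not the complete parameterization of Fan et al., 2021), one can map a free matrix to an operator with spectral norm below one, which is automatically Schur stable:

```python
import numpy as np

def stable_from_free(A_free, margin=1e-3):
    """Map an arbitrary matrix to one with spectral norm < 1 (hence Schur stable).

    Note: this covers only norm-contractive K, a strict subset of all
    Schur-stable matrices; Fan et al. (2021) give a complete parameterization.
    """
    s = np.linalg.norm(A_free, 2)
    return A_free / (1.0 + s + margin)

K = stable_from_free(np.random.default_rng(3).normal(size=(4, 4)))
print(np.max(np.abs(np.linalg.eigvals(K))) < 1.0)  # True
```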

6. Learning Strategies, Robustness, and Online Adaptation

Hybrid frameworks use semidefinite programming (SDP) to determine the minimal latent dimension and memory depth of an approximate Koopman-invariant subspace, followed by deep autoencoder training that refines the mapping and enables explicit coordinate reconstruction (Estornell et al., 25 Apr 2025). Conformal online learning (COLoKe) uses a prediction-conformity mechanism to trigger adaptive updates only when the model residual exceeds a dynamically tuned threshold, thereby limiting unnecessary retraining and avoiding overfitting in streaming and nonstationary contexts (Gao et al., 16 Nov 2025).
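
A schematic of a residual-triggered update loop (a heavily simplified sketch of the idea; the rolling-quantile threshold and least-squares refit here are generic stand-ins, not the exact COLoKe mechanism):

```python
import numpy as np

def online_koopman(stream, K, lift, q=0.95, window=200):
    """Update K only when the one-step residual exceeds a rolling quantile.

    stream yields (x_k, x_{k+1}) pairs; lift maps states to features.
    Illustrative only: the refit is a plain least-squares on a buffer.
    """
    residuals, buffer = [], []
    for x, x_next in stream:
        z, z_next = lift(x), lift(x_next)
        r = np.linalg.norm(z_next - K @ z)
        buffer.append((z, z_next))
        if len(residuals) >= 20 and r > np.quantile(residuals[-window:], q):
            Z = np.stack([b[0] for b in buffer], axis=1)
            Zp = np.stack([b[1] for b in buffer], axis=1)
            K = Zp @ np.linalg.pinv(Z)      # conformity violated: refit
        residuals.append(r)
    return K
```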

Nonparametric kernel approaches (operator stochastic approximation + online compression) enable memory-efficient, consistent learning of the Koopman operator directly in RKHS, with finite-sample error bounds and asymptotic guarantees (Hou et al., 27 Jan 2025).
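
For the kernel setting, a compact batch sketch (the standard kernel-DMD structure, not the streaming algorithm of Hou et al., 27 Jan 2025; the Gaussian kernel and regularizer are assumptions): with Gram matrices $G_{ij} = k(x_i, x_j)$ and $A_{ij} = k(y_i, x_j)$, the estimated operator on kernel coefficients is $\hat{K} = G^{+} A$.

```python
import numpy as np

def kernel_dmd_eigs(X, Y, bandwidth=1.0, reg=1e-8):
    """Koopman eigenvalue estimates via kernel EDMD with a Gaussian kernel.

    X, Y: (n, m) snapshot matrices with Y[:, k] = F(X[:, k]).
    """
    def gram(A, B):
        d2 = ((A[:, :, None] - B[:, None, :]) ** 2).sum(axis=0)
        return np.exp(-d2 / (2 * bandwidth**2))
    G = gram(X, X)                  # G_ij = k(x_i, x_j)
    A = gram(Y, X)                  # A_ij = k(y_i, x_j)
    K_hat = np.linalg.pinv(G + reg * np.eye(G.shape[0])) @ A
    return np.linalg.eigvals(K_hat)
```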

7. Applications and Empirical Performance

A broad range of practical applications demonstrates the utility of Koopman embeddings:

  • Modal analysis and coherent structure identification in fluid flows via DMD/EDMD (Brunton et al., 2021).
  • Prediction and control of nonlinear oscillators and hybrid systems, including Duffing, Van der Pol, Lotka–Volterra, and robotic locomotion (Brunton et al., 2021, Yang et al., 19 Jul 2025).
  • Networked dynamical systems and neural networks via graph-message-passing Koopman encoders for global linearization (Yeh et al., 2023).
  • Trajectory planning for robotic manipulators in dynamic environments: deep Koopman embeddings enable real-time, multi-step collision-aware rollout in motion planners (Chen et al., 5 Jul 2025).
  • Structure-preserving embeddings for Hamiltonian PDEs: cubicized autoencoders outperform linear/quadratic embeddings in long-term simulation accuracy (Goyal et al., 2023).
  • Quantum algorithms for differential equations: Koopman–von Neumann embedding maps suitable linear ODEs into Schrödinger-type equations for efficient quantum simulation, subject to strict spectral conditions (Ito et al., 2023).
  • Generalized multi-task representations: Koopman embeddings capture reusable dynamical structure that transfers across prediction and control tasks, as demonstrated in Lorenz system experiments (Hjikakou et al., 26 Aug 2025).

Across these domains, empirical studies show that modern Koopman embedding methods yield substantial gains in prediction accuracy, data efficiency, and robustness compared to classical linearization and baseline neural methods (Estornell et al., 25 Apr 2025, Gao et al., 16 Nov 2025, Yeh et al., 2023, Hjikakou et al., 26 Aug 2025).

8. Open Challenges, Limitations, and Future Directions

While progress is rapid, central challenges remain. These include:

  • Existence and scope: Finite-dimensional, globally one-to-one continuous embeddings are generically impossible for nonlinear systems with multiple basins of attraction; the collapse of omega-limit sets in learned embeddings is now rigorously established (Liu et al., 2023). This motivates chart-based or discontinuous (piecewise) embeddings and greater use of infinite-dimensional dictionaries (kernel, Hilbert-space methods).
  • Embedding selection: The quality of dictionaries/basis functions critically affects the fidelity and generalizability of EDMD/Koopman AEs; overfitting and regularization remain active areas of study (Brunton et al., 2021, Jayarathne et al., 2023).
  • Spectral and control-theoretic extension: The design of embeddings for input–affine, hybrid, or high-dimensional systems requires further advances in parameter-varying and bilinear formulations, as well as integration with robust control synthesis (Iacob et al., 2022, Iacob et al., 20 Jul 2025).
  • Noise resilience and scalability: Deep methods for denoising and high-dimensional PDEs, automated hyperparameter tuning, and robust online updating are under development (Goyal et al., 2023, Gao et al., 16 Nov 2025, Hou et al., 27 Jan 2025).
  • Quantum and computational complexity: Understanding the spectral conditions and complexity theory underlying Koopman–von Neumann embeddings for classical-to-quantum mapping opens new domains for computational dynamical systems (Ito et al., 2023).

The field continues to advance rapidly at the intersection of operator theory, machine learning, nonlinear dynamics, and computational geometry, with Koopman embeddings at the interface of theoretical insight and practical modeling power.
