Exact Inference for Continuous-Time Gaussian Process Dynamics

Published 5 Sep 2023 in cs.LG and stat.ML (arXiv:2309.02351v2)

Abstract: Physical systems can often be described by a continuous-time dynamical system. In practice, the true system is often unknown and has to be learned from measurement data. Since data is typically collected in discrete time, e.g., by sensors, most methods for learning Gaussian process (GP) dynamics models are trained on one-step-ahead predictions. This becomes problematic in several scenarios, e.g., if measurements arrive at irregularly sampled time steps or physical properties of the system have to be conserved. We therefore aim for a GP model of the true continuous-time dynamics. Higher-order numerical integrators provide the necessary tools to address this problem by discretizing the dynamics function with arbitrary accuracy. However, many higher-order integrators require dynamics evaluations at intermediate time steps, making exact GP inference intractable. Previous work often tackles this problem by approximating the GP posterior with variational inference, yet exact GP inference is preferable in many scenarios, e.g., due to its mathematical guarantees. To make direct inference tractable, we propose to leverage multistep and Taylor integrators, and we show how to derive flexible inference schemes for these types of integrators. Further, we derive tailored sampling schemes that allow us to draw consistent dynamics functions from the learned posterior, which is crucial for sampling consistent predictions from the dynamics model. We demonstrate empirically and theoretically that our approach yields an accurate representation of the continuous-time system.
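The appeal of multistep integrators here is that they evaluate the dynamics only at observed states, so pseudo-targets for exact GP regression can be recovered directly from a trajectory. The following is a minimal sketch of that idea for the 2-step Adams-Bashforth rule on a 1-D toy system; the function names and the plain RBF GP are illustrative choices made for this sketch, not the paper's implementation:

```python
import numpy as np

def ab2_pseudo_targets(x, h):
    # Adams-Bashforth 2-step: x[n+1] = x[n] + h*(3/2 f[n] - 1/2 f[n-1]).
    # Because f appears only at observed states, the relation can be
    # inverted recursively for f[n] -- no intermediate evaluations needed.
    d = np.diff(x) / h                    # one-step finite differences
    f = np.empty(len(x) - 1)
    f[0] = d[0]                           # crude Euler estimate for f[0]
    for n in range(1, len(f)):
        f[n] = (d[n] + 0.5 * f[n - 1]) * 2.0 / 3.0
    return x[:-1], f                      # GP inputs and regression targets

def gp_posterior_mean(X, y, Xs, ell=0.5, sf=1.0, noise=1e-4):
    # Standard exact GP regression with an RBF kernel (1-D inputs).
    k = lambda a, b: sf**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)
    K = k(X, X) + noise * np.eye(len(X))
    return k(Xs, X) @ np.linalg.solve(K, y)

# Toy data: dx/dt = -x, sampled on a uniform grid.
h = 0.05
t = np.arange(0.0, 2.0, h)
x = np.exp(-t)                            # exact trajectory of dx/dt = -x
X, f = ab2_pseudo_targets(x, h)
mean = gp_posterior_mean(X, f, X)         # should recover f(x) = -x closely
```

The point of the sketch is structural: a Runge-Kutta scheme of the same order would need f at intermediate states such as x + (h/2) f(x), which are never observed and make the exact GP posterior intractable, whereas the multistep relation stays within the data.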
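The consistency requirement on sampled dynamics functions can be illustrated with a weight-space view of a GP: once the random feature weights of one sample are fixed, every evaluation during a rollout comes from a single coherent function rather than a fresh draw at each step. Below is a hedged sketch using the random-features approximation of an RBF kernel (a prior sample only; the paper's tailored posterior sampling schemes are more involved, and all names here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(x, omega, phase, sf=1.0):
    # Random Fourier features approximating an RBF kernel.
    return sf * np.sqrt(2.0 / len(omega)) * np.cos(np.outer(x, omega) + phase)

# Fix the feature weights once; f_sample is then a deterministic function,
# so a rollout that queries it repeatedly sees one consistent dynamics
# function instead of independent GP marginals at each step.
ell, n_feat = 0.5, 200
omega = rng.normal(0.0, 1.0 / ell, n_feat)   # spectral frequencies
phase = rng.uniform(0.0, 2.0 * np.pi, n_feat)
w = rng.normal(size=n_feat)                  # prior weights; posterior weights
                                             # would come from conditioning on data
f_sample = lambda x: rff_features(x, omega, phase) @ w

a = f_sample(np.array([0.3]))
b = f_sample(np.array([0.3]))
# same input -> identical value: the sampled function is self-consistent
```

Replacing the prior weights `w` with a posterior draw conditioned on the training data yields the pathwise posterior samples needed for consistent trajectory predictions.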
