Deep Latent Force Models: ODE-based Process Convolutions for Bayesian Deep Learning
Abstract: Modelling the behaviour of highly nonlinear dynamical systems with robust uncertainty quantification is a challenging task that typically requires approaches designed specifically for the problem at hand. To address this issue, we introduce a domain-agnostic model termed the deep latent force model (DLFM): a deep Gaussian process with physics-informed kernels at each layer, derived from ordinary differential equations using the framework of process convolutions. We present two distinct formulations of the DLFM, based on weight-space and variational inducing-points Gaussian process approximations respectively, both of which are amenable to doubly stochastic variational inference. We present empirical evidence that the DLFM captures the dynamics present in highly nonlinear, real-world, multi-output time series data. Additionally, we find that the DLFM achieves performance comparable to a range of non-physics-informed probabilistic models on benchmark univariate regression tasks. We also empirically assess the negative impact of the inducing-points framework on the extrapolation capabilities of LFM-based models.
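For context, a minimal sketch of the first-order latent force construction underlying such physics-informed kernels (in our own notation, which need not match the paper's): each output $f_d$ of a layer is assumed to solve an ODE driven by latent Gaussian process forces $u_q$,

$$\frac{\mathrm{d}f_d(t)}{\mathrm{d}t} + \gamma_d f_d(t) = \sum_{q=1}^{Q} S_{dq}\, u_q(t).$$

Solving via the Green's function $G_d(t) = e^{-\gamma_d t}$ expresses each output as a process convolution,

$$f_d(t) = \sum_{q=1}^{Q} S_{dq} \int_0^{t} e^{-\gamma_d (t-\tau)}\, u_q(\tau)\, \mathrm{d}\tau,$$

so, assuming independent forces, each layer remains a Gaussian process whose covariance

$$k_{f_d f_{d'}}(t, t') = \sum_{q=1}^{Q} S_{dq} S_{d'q} \int_0^{t}\!\!\int_0^{t'} e^{-\gamma_d (t-\tau)}\, e^{-\gamma_{d'} (t'-\tau')}\, k_{u_q}(\tau, \tau')\, \mathrm{d}\tau\, \mathrm{d}\tau'$$

encodes the ODE dynamics; stacking such layers yields a deep GP with a physics-informed kernel at every layer.

The weight-space formulation mentioned above replaces each latent force with a random Fourier feature expansion, which makes drawing layer samples cheap. The snippet below illustrates this idea for a single layer under the first-order ODE above, assuming an RBF kernel on the latent force and a uniform time grid; the function names and the discretised convolution are our own illustration, not the authors' implementation.

```python
import numpy as np

# Hedged sketch of the weight-space (random Fourier feature) view of a single
# first-order latent force layer; all names here are our own, not the paper's.

def sample_latent_force(t, n_features=100, lengthscale=1.0, seed=0):
    """Draw one weight-space sample of an RBF-kernel latent force u(t)
    via random Fourier features (Rahimi & Recht, 2007)."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, 1.0 / lengthscale, n_features)  # RBF spectral frequencies
    phase = rng.uniform(0.0, 2.0 * np.pi, n_features)
    w = rng.normal(0.0, 1.0, n_features)                    # feature weights
    return np.sqrt(2.0 / n_features) * np.cos(np.outer(t, omega) + phase) @ w

def first_order_lfm_layer(t, u, gamma=0.5):
    """Approximate f(t) = int_0^t exp(-gamma (t - tau)) u(tau) dtau on a
    uniform grid by discretising the convolution with the Green's function."""
    dt = t[1] - t[0]
    green = np.exp(-gamma * (t - t[0]))
    return dt * np.convolve(u, green)[: len(t)]

# One forward sample through the layer; in a DLFM, several such ODE-based
# layers would be composed as a deep Gaussian process.
t = np.linspace(0.0, 10.0, 500)
u = sample_latent_force(t, seed=0)
f = first_order_lfm_layer(t, u, gamma=0.5)
```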