
Deep Generative Modeling for Identification of Noisy, Non-Stationary Dynamical Systems (2410.02079v1)

Published 2 Oct 2024 in cs.LG and q-bio.QM

Abstract: A significant challenge in many fields of science and engineering is making sense of time-dependent measurement data by recovering governing equations in the form of differential equations. We focus on finding parsimonious ordinary differential equation (ODE) models for nonlinear, noisy, and non-autonomous dynamical systems and propose a machine learning method for data-driven system identification. While many methods tackle noisy and limited data, non-stationarity, where differential equation parameters change over time, has received less attention. Our method, dynamic SINDy, combines variational inference with SINDy (sparse identification of nonlinear dynamics) to model time-varying coefficients of sparse ODEs. This framework allows for uncertainty quantification of ODE coefficients, expanding on previous methods for autonomous systems. These coefficients are then interpreted as latent variables and added to the system to obtain an autonomous dynamical model. We validate our approach using synthetic data, including nonlinear oscillators and the Lorenz system, and apply it to neuronal activity data from C. elegans. Dynamic SINDy uncovers a global nonlinear model, showing it can handle real, noisy, and chaotic datasets. We aim to apply our method to a variety of problems, specifically dynamic systems with complex time-dependent parameters.
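To make the abstract's description of SINDy concrete, below is a minimal sketch of the sparse-regression core that dynamic SINDy builds on: sequential thresholded least squares over a library of candidate functions. This is the standard SINDy step only; the paper's contribution (variational inference over time-varying coefficients) is not implemented here. The toy system, library choice, and threshold value are illustrative assumptions, and the data are noise-free for clarity.

```python
import numpy as np

def sindy_stlsq(X, dXdt, library, threshold=0.1, iters=10):
    """Sequential thresholded least squares: fit coefficients of a
    candidate-function library to observed derivatives, repeatedly
    pruning coefficients smaller than `threshold` and re-fitting."""
    Theta = np.column_stack([f(X) for f in library])  # library matrix
    Xi = np.linalg.lstsq(Theta, dXdt, rcond=None)[0]  # initial dense fit
    for _ in range(iters):
        small = np.abs(Xi) < threshold
        Xi[small] = 0.0                               # enforce sparsity
        big = ~small
        if big.any():                                 # re-fit surviving terms
            Xi[big] = np.linalg.lstsq(Theta[:, big], dXdt, rcond=None)[0]
    return Xi

# Toy data: dx/dt = -2x, with the derivative computed analytically
# (a hypothetical, noise-free example; real use would estimate dXdt
# numerically from measurements).
t = np.linspace(0.0, 2.0, 200)
x = np.exp(-2.0 * t)
dxdt = -2.0 * x

# Polynomial candidate library: [1, x, x^2, x^3].
library = [np.ones_like, lambda v: v, lambda v: v**2, lambda v: v**3]
xi = sindy_stlsq(x, dxdt, library)
# xi is sparse: only the linear term survives, with coefficient ~ -2.
```

Dynamic SINDy replaces the single constant vector `xi` with a time series of coefficients inferred by a variational autoencoder, so that parameter drift (non-stationarity) shows up as structured variation in those latent coefficient trajectories.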

