DynGMA: a robust approach for learning stochastic differential equations from data (2402.14475v2)
Abstract: Learning unknown stochastic differential equations (SDEs) from observed data is a significant and challenging task with applications in various fields. Current approaches often use neural networks to represent the drift and diffusion functions, and construct a likelihood-based loss by approximating the transition density to train these networks. However, these methods often rely on one-step stochastic numerical schemes, necessitating data with sufficiently high time resolution. In this paper, we introduce novel approximations to the transition density of the parameterized SDE: a Gaussian density approximation inspired by the random perturbation theory of dynamical systems, and its extension, the dynamical Gaussian mixture approximation (DynGMA). Benefiting from the robust density approximation, our method exhibits superior accuracy compared to baseline methods in learning the fully unknown drift and diffusion functions and computing the invariant distribution from trajectory data. It is also capable of handling trajectory data with low time resolution and variable, even uncontrollable, time step sizes, such as data generated by Gillespie's stochastic simulations. We then conduct several experiments across various scenarios to verify the advantages and robustness of the proposed method.
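To make the likelihood-based setup described in the abstract concrete, below is a minimal PyTorch sketch of the standard baseline it contrasts against: neural networks for the drift and diffusion trained with a negative log-likelihood built from the one-step Euler-Maruyama Gaussian transition density. The network architectures, the diagonal diffusion, and all hyperparameters are illustrative assumptions; this is not the paper's DynGMA approximation, which would replace this one-step transition density with a more robust one.

```python
import torch
import torch.nn as nn

# Baseline sketch (not DynGMA): one-step Euler-Maruyama Gaussian approximation
# of the SDE transition density,
#   p(x_{t+dt} | x_t) ~ N(x_t + f_theta(x_t) dt, Sigma_theta(x_t) dt),
# used as a negative log-likelihood loss over consecutive observation pairs.

class DriftNet(nn.Module):
    """Neural network for the drift f_theta(x) (architecture is illustrative)."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(),
                                 nn.Linear(hidden, dim))
    def forward(self, x):
        return self.net(x)

class DiffusionNet(nn.Module):
    """Neural network for a diagonal diffusion sigma_theta(x) > 0 (illustrative choice)."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(),
                                 nn.Linear(hidden, dim), nn.Softplus())
    def forward(self, x):
        return self.net(x)

def em_gaussian_nll(drift, diffusion, x0, x1, dt):
    """Negative log-likelihood of x1 given x0 under the one-step
    Euler-Maruyama Gaussian approximation with step size dt."""
    mean = x0 + drift(x0) * dt
    var = (diffusion(x0) ** 2) * dt  # diagonal covariance sigma(x0)^2 * dt
    nll = 0.5 * (((x1 - mean) ** 2) / var + torch.log(2 * torch.pi * var))
    return nll.sum(dim=-1).mean()

# Usage sketch: (x0, x1) are consecutive observations separated by time dt.
dim, dt = 2, 0.1
drift, diffusion = DriftNet(dim), DiffusionNet(dim)
x0, x1 = torch.randn(128, dim), torch.randn(128, dim)  # placeholder data
opt = torch.optim.Adam(list(drift.parameters()) + list(diffusion.parameters()), lr=1e-3)
opt.zero_grad()
loss = em_gaussian_nll(drift, diffusion, x0, x1, dt)
loss.backward()
opt.step()
```

As the abstract notes, such a one-step Gaussian scheme is only accurate when dt is small, which is the limitation the paper's Gaussian and Gaussian-mixture density approximations are designed to overcome.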