
Time-changed normalizing flows for accurate SDE modeling (2312.14698v2)

Published 22 Dec 2023 in cs.LG and stat.ML

Abstract: The generative paradigm has become increasingly important in machine learning and deep learning. Among popular generative models are normalizing flows, which enable exact likelihood estimation by transforming a base distribution through diffeomorphic transformations. Extending the normalizing flow framework to time-indexed flows yields dynamic normalizing flows, a powerful tool for modeling time series, stochastic processes, and neural stochastic differential equations (SDEs). In this work, we propose a novel variant of dynamic normalizing flows, the Time-Changed Normalizing Flow (TCNF), based on time deformations of a Brownian motion, which constitute a versatile and broad family of Gaussian processes. This approach lets us effectively model SDEs that cannot be modeled otherwise, including standard ones such as the well-known Ornstein-Uhlenbeck process, and it generalizes prior methodologies, leading to improved results and better inference and prediction capabilities.
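
To make the time-change idea concrete, here is a minimal sketch (not the paper's TCNF implementation) of the classical fact the abstract alludes to: the Ornstein-Uhlenbeck process dX_t = -theta X_t dt + sigma dW_t can be written as a rescaled, time-deformed Brownian motion, X_t = x0 e^{-theta t} + (sigma / sqrt(2 theta)) e^{-theta t} B(e^{2 theta t} - 1). All parameter values and variable names below are assumptions chosen for illustration.

```python
# Minimal sketch (illustrative only, not the paper's TCNF code): simulate the
# Ornstein-Uhlenbeck process through the deterministic time change
# tau(t) = exp(2*theta*t) - 1 applied to a standard Brownian motion B.
import numpy as np

rng = np.random.default_rng(0)
theta, sigma, x0 = 1.0, 0.5, 1.0   # illustrative parameters (assumed)
t = np.linspace(0.0, 2.0, 201)     # observation grid
n_paths = 20_000

# Deformed clock: strictly increasing, with tau(0) = 0.
tau = np.exp(2.0 * theta * t) - 1.0

# Sample B at the deformed times via independent Gaussian increments.
dB = rng.normal(0.0, np.sqrt(np.diff(tau)), size=(n_paths, t.size - 1))
B_tau = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)], axis=1)

# Time-changed representation of the OU path:
# X_t = x0 e^{-theta t} + (sigma / sqrt(2 theta)) e^{-theta t} B(tau(t)).
x = x0 * np.exp(-theta * t) + (sigma / np.sqrt(2.0 * theta)) * np.exp(-theta * t) * B_tau

# Sanity check: empirical terminal variance vs the exact OU variance
# sigma^2 (1 - e^{-2 theta T}) / (2 theta).
emp = x[:, -1].var()
exact = sigma**2 * (1.0 - np.exp(-2.0 * theta * t[-1])) / (2.0 * theta)
print(f"empirical var {emp:.4f} vs exact {exact:.4f}")
```

A TCNF, as described in the abstract, would learn the time deformation and the output transformation from data rather than fixing them in closed form as done in this sketch.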

