Time-changed normalizing flows for accurate SDE modeling (2312.14698v2)
Abstract: The generative paradigm has become increasingly important in machine learning and deep learning. Among the most popular generative models are normalizing flows, which enable exact likelihood estimation by transforming a base distribution through diffeomorphic transformations. Extending the normalizing flow framework to time-indexed flows yields dynamic normalizing flows, a powerful tool for modeling time series, stochastic processes, and neural stochastic differential equations (SDEs). In this work, we propose a novel variant of dynamic normalizing flows, the Time-Changed Normalizing Flow (TCNF), based on time deformations of a Brownian motion, which constitute a versatile and extensive family of Gaussian processes. This approach enables us to effectively model SDEs that cannot be modeled otherwise, including standard ones such as the well-known Ornstein-Uhlenbeck process, as illustrated below. It generalizes prior methodologies, leading to improved results and better inference and prediction capabilities.
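As a concrete illustration of the time-change idea (a standard fact about Gaussian processes, sketched here with notation of our own choosing rather than taken from the paper's derivation): the Ornstein-Uhlenbeck process $dX_t = -\theta X_t\,dt + \sigma\,dW_t$ can itself be written as a rescaled, deterministically time-changed Brownian motion,

$$X_t = e^{-\theta t} X_0 + \sigma e^{-\theta t} B_{\tau(t)}, \qquad \tau(t) = \frac{e^{2\theta t} - 1}{2\theta},$$

where $B$ is a standard Brownian motion and $\tau$ is the deterministic clock. A quick variance check, $\sigma^2 e^{-2\theta t}\,\tau(t) = \frac{\sigma^2}{2\theta}\bigl(1 - e^{-2\theta t}\bigr)$, recovers the familiar OU marginal variance, which conveys the sense in which a learned time deformation of Brownian motion can capture such processes.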