
Space-Time Diffusion Bridge (2402.08847v2)

Published 13 Feb 2024 in stat.ML and cs.LG

Abstract: In this study, we introduce a novel method for generating new synthetic samples that are independent and identically distributed (i.i.d.) from high-dimensional real-valued probability distributions, as defined implicitly by a set of Ground Truth (GT) samples. Central to our method is the integration of space-time mixing strategies that extend across temporal and spatial dimensions. Our methodology is underpinned by three interrelated stochastic processes designed to enable optimal transport from an easily tractable initial probability distribution to the target distribution represented by the GT samples: (a) linear processes incorporating space-time mixing that yield Gaussian conditional probability densities, (b) their diffusion bridge analogs that are conditioned on the initial and final state vectors, and (c) nonlinear stochastic processes refined through score-matching techniques. The crux of our training regime involves fine-tuning the nonlinear model, and potentially the linear models, to align closely with the GT data. We validate the efficacy of our space-time diffusion approach with numerical experiments, laying the groundwork for more extensive future theory and experiments to fully validate the method, in particular toward more efficient (possibly simulation-free) inference.
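The paper provides no code on this page. As a rough illustration of ingredient (b), a stochastic process conditioned on both its initial and final state vectors, the sketch below simulates the simplest such object, a Brownian bridge, with Euler-Maruyama. The function name, step count, and noise scale are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def simulate_brownian_bridge(x0, x1, sigma=1.0, n_steps=1000, seed=0):
    """Euler-Maruyama simulation of a Brownian bridge pinned at x0 (t=0) and x1 (t=1).

    The bridge SDE is dX_t = (x1 - X_t) / (1 - t) dt + sigma dW_t; the drift
    pulls the path toward the terminal state x1 as t approaches 1.
    """
    x0 = np.asarray(x0, dtype=float)
    x1 = np.asarray(x1, dtype=float)
    rng = np.random.default_rng(seed)
    dt = 1.0 / n_steps
    x = x0.copy()
    path = [x.copy()]
    # Integrate up to the step before t = 1 to avoid the 1/(1 - t) singularity,
    # then pin the final state exactly at x1 (the defining property of a bridge).
    for k in range(n_steps - 1):
        t = k * dt
        drift = (x1 - x) / (1.0 - t)
        x = x + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(x.shape)
        path.append(x.copy())
    path.append(x1.copy())
    return np.stack(path)  # shape: (n_steps + 1, *x0.shape)
```

The paper's linear space-time mixing processes generalize this drift, but any such bridge shares the two pinned endpoints shown here.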
