
Trivialized Momentum Facilitates Diffusion Generative Modeling on Lie Groups (2405.16381v2)

Published 25 May 2024 in cs.LG, cs.AI, and stat.ML

Abstract: The generative modeling of data on manifolds is an important task, for which diffusion models in flat spaces typically require nontrivial adaptations. This article demonstrates how a technique called 'trivialization' can transfer the effectiveness of diffusion models in Euclidean spaces to Lie groups. In particular, an auxiliary momentum variable is algorithmically introduced to help transport the position variable between the data distribution and a fixed, easy-to-sample distribution. Normally, this would incur further difficulty for manifold data because the momentum lives in a space that changes with the position. However, our trivialization technique creates a new momentum variable that stays in a single fixed vector space. This design, together with a manifold-preserving integrator, simplifies implementation and avoids inaccuracies created by approximations such as projections to the tangent space and the manifold, which were typically used in prior work, thereby enabling high-fidelity and efficient generation. The resulting method achieves state-of-the-art performance on protein and RNA torsion-angle generation and on sophisticated torus datasets. We also, arguably for the first time, tackle the generation of data on high-dimensional Special Orthogonal and Unitary groups, the latter being essential for quantum problems. Code is available at https://github.com/yuchen-zhu-zyc/TDM.
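To make the trivialization idea concrete, below is a minimal sketch (not the authors' implementation; see the linked repository for that) of kinetic Langevin-style dynamics on SO(3) with a trivialized momentum. The position g evolves on the group via the exponential map, while the momentum xi stays in the fixed vector space R^3 identified with the Lie algebra so(3), so no tangent-space or manifold projections are needed. The step size h, friction gamma, and the pure-noise force term are illustrative assumptions; the actual generative model would replace the noise-only update with a learned score term.

```python
# A minimal sketch of trivialized-momentum dynamics on SO(3).
# Assumptions: step size h, friction gamma, and noise-only forcing
# are illustrative placeholders, not the paper's exact scheme.
import numpy as np
from scipy.linalg import expm

def hat(v):
    """Map a vector in R^3 to the corresponding skew-symmetric matrix in so(3)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def kinetic_langevin_step(g, xi, h=0.01, gamma=1.0):
    """One Euler-type step with trivialized momentum.

    g  : 3x3 rotation matrix (position on SO(3))
    xi : vector in R^3 ~ so(3) (trivialized momentum, a flat fixed space)
    """
    # Momentum update happens entirely in flat R^3: damping plus Gaussian
    # noise (a learned score/force term would be added here in the model).
    noise = np.random.randn(3)
    xi = xi - h * gamma * xi + np.sqrt(2.0 * gamma * h) * noise
    # Position update via the group exponential keeps g exactly on SO(3),
    # with no projection back to the manifold.
    g = g @ expm(h * hat(xi))
    return g, xi

g, xi = np.eye(3), np.zeros(3)
for _ in range(1000):
    g, xi = kinetic_langevin_step(g, xi)
print(np.allclose(g.T @ g, np.eye(3), atol=1e-6))  # g remains orthogonal
```

Because xi always lives in the same vector space R^3, a score network can consume it like any Euclidean input, which is precisely what lets Euclidean diffusion machinery transfer to the Lie group setting.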

