Micro-Macro Consistency in Multiscale Modeling: Score-Based Model Assisted Sampling of Fast/Slow Dynamical Systems (2312.05715v2)

Published 10 Dec 2023 in cs.LG and math.DS

Abstract: A valuable step in modeling multiscale dynamical systems in fields such as computational chemistry, biology, and materials science is the representative sampling of phase space over the long timescales of interest; this task is not, however, without challenges. For example, the long-term behavior of a system with many degrees of freedom often cannot be efficiently explored by direct dynamical simulation, because such systems become trapped in local free-energy minima. In the study of physics-based multi-time-scale dynamical systems, enhanced-sampling techniques have been developed to accelerate exploration beyond free-energy barriers. In Machine Learning, meanwhile, a generic goal of generative models is to sample from a target density after training on empirical samples from that density. Score-based generative models (SGMs) have demonstrated state-of-the-art capabilities in generating plausible data from target training distributions, and conditional implementations of such models exhibit significant parallels with long-established, physics-based solutions to enhanced sampling. The physics-based methods can therefore be coupled with ML generative models, complementing the strengths and mitigating the weaknesses of each technique. In this work, we show that SGMs can be used in such a coupling framework to improve sampling in multiscale dynamical systems.
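The abstract's core idea, sampling a target density by following its score (the gradient of the log-density), can be illustrated with a toy sketch. This is not the paper's method: instead of a trained SGM score network, it uses the analytic score of a hypothetical one-dimensional double-well free energy, and plain unadjusted Langevin dynamics in place of a reverse diffusion. The function names and parameters are illustrative assumptions.

```python
import numpy as np

# Toy 1-D double-well free energy U(x) = (x^2 - 1)^2, whose Boltzmann
# density is p(x) ∝ exp(-U(x)). A trained SGM would approximate the
# score ∇ log p(x); here the analytic score stands in for it.
def score(x):
    # d/dx [-U(x)] = -4 x (x^2 - 1)
    return -4.0 * x * (x**2 - 1.0)

def langevin_sample(n_steps=5000, step=0.01, n_chains=1000, seed=0):
    """Unadjusted Langevin dynamics:
    x <- x + step * score(x) + sqrt(2 * step) * gaussian_noise."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n_chains)  # start chains from a broad Gaussian
    for _ in range(n_steps):
        x = x + step * score(x) + np.sqrt(2.0 * step) * rng.normal(size=n_chains)
    return x

samples = langevin_sample()
# With a low barrier (U(0) = 1 in units of kT), chains cross freely and
# the two wells at x = ±1 end up roughly equally populated.
left_fraction = float(np.mean(samples < 0.0))
```

For a deep barrier, direct dynamics of this kind would rarely hop between wells, which is exactly the trapping problem the enhanced-sampling / SGM coupling in the paper is meant to address.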
