
GANs and Closures: Micro-Macro Consistency in Multiscale Modeling (2208.10715v4)

Published 23 Aug 2022 in cs.LG, math.DS, and physics.chem-ph

Abstract: Sampling the phase space of molecular systems -- and, more generally, of complex systems effectively modeled by stochastic differential equations -- is a crucial modeling step in many fields, from protein folding to materials discovery. These problems are often multiscale in nature: they can be described in terms of low-dimensional effective free energy surfaces parametrized by a small number of "slow" reaction coordinates, while the remaining "fast" degrees of freedom populate an equilibrium measure conditioned on the reaction coordinate values. Sampling procedures for such problems are used to estimate effective free energy differences as well as ensemble averages with respect to the conditional equilibrium distributions; these latter averages lead to closures for effective reduced dynamic models. Over the years, enhanced sampling techniques coupled with molecular simulation have been developed for this purpose. An intriguing analogy arises with the field of Machine Learning (ML), where Generative Adversarial Networks (GANs) can produce high-dimensional samples from low-dimensional probability distributions: they return plausible high-dimensional realizations of a model state from information about its low-dimensional representation. In this work, we present an approach that couples physics-based simulations and biasing methods for sampling conditional distributions with ML-based conditional generative adversarial networks (cGANs) performing the same task. The "coarse descriptors" on which we condition the fine-scale realizations can either be known a priori or learned through nonlinear dimensionality reduction. We suggest that this coupling may bring out the best features of both approaches: we demonstrate that a framework combining cGANs with physics-based enhanced sampling techniques can improve the sampling of multiscale SDE dynamical systems, and even shows promise for systems of increasing complexity.
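
To make the conditioning idea concrete, the sketch below shows the generic structure of a conditional GAN of the kind the abstract describes: a generator maps (noise, coarse descriptor) pairs to fine-scale configurations, and a discriminator scores (configuration, descriptor) pairs against training data that would come from biased physics-based simulations. This is a minimal illustrative sketch in PyTorch, not the authors' implementation; the layer sizes, the dimensions, and the names (Generator, Discriminator, train_step) are assumptions chosen for brevity.

    # Minimal conditional-GAN sketch (assumed PyTorch; not the paper's code).
    # Generator: (noise z, coarse descriptor c) -> fine-scale configuration x.
    # Discriminator: scores (x, c) pairs. All dimensions below are illustrative.
    import torch
    import torch.nn as nn

    NOISE_DIM, COARSE_DIM, FINE_DIM = 8, 1, 10  # assumed sizes

    class Generator(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(NOISE_DIM + COARSE_DIM, 64), nn.ReLU(),
                nn.Linear(64, 64), nn.ReLU(),
                nn.Linear(64, FINE_DIM),
            )
        def forward(self, z, c):
            return self.net(torch.cat([z, c], dim=1))

    class Discriminator(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(FINE_DIM + COARSE_DIM, 64), nn.ReLU(),
                nn.Linear(64, 1),
            )
        def forward(self, x, c):
            return self.net(torch.cat([x, c], dim=1))

    gen, disc = Generator(), Discriminator()
    g_opt = torch.optim.Adam(gen.parameters(), lr=1e-4)
    d_opt = torch.optim.Adam(disc.parameters(), lr=1e-4)
    bce = nn.BCEWithLogitsLoss()

    def train_step(x_real, c):
        """One adversarial update on a batch of (configuration, descriptor) pairs,
        e.g. harvested from biased/enhanced-sampling trajectories."""
        batch = x_real.shape[0]
        z = torch.randn(batch, NOISE_DIM)
        x_fake = gen(z, c)

        # Discriminator: separate real (x, c) pairs from generated ones.
        d_loss = bce(disc(x_real, c), torch.ones(batch, 1)) + \
                 bce(disc(x_fake.detach(), c), torch.zeros(batch, 1))
        d_opt.zero_grad(); d_loss.backward(); d_opt.step()

        # Generator: fool the discriminator at the same descriptor values.
        g_loss = bce(disc(x_fake, c), torch.ones(batch, 1))
        g_opt.zero_grad(); g_loss.backward(); g_opt.step()
        return d_loss.item(), g_loss.item()

    # Usage (illustrative): after training, draw fine-scale realizations consistent
    # with a prescribed coarse value, e.g. c_star = torch.full((256, 1), 0.3):
    # x_samples = gen(torch.randn(256, NOISE_DIM), c_star)

Samples produced this way can then feed the conditional ensemble averages (closures) mentioned in the abstract, with the physics-based biased simulations supplying the training data and a consistency check.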

Authors (4)
  1. Ellis R. Crabtree (3 papers)
  2. Juan M. Bello-Rivas (17 papers)
  3. Andrew L. Ferguson (17 papers)
  4. Ioannis G. Kevrekidis (116 papers)
Citations (3)
