
Deep Coarse-grained Potentials via Relative Entropy Minimization (2208.10330v2)

Published 22 Aug 2022 in physics.chem-ph and physics.comp-ph

Abstract: Neural network (NN) potentials are a natural choice for coarse-grained (CG) models. Their many-body capacity allows highly accurate approximations of the potential of mean force, promising CG simulations at unprecedented accuracy. CG NN potentials trained bottom-up via force matching (FM), however, suffer from finite data effects: They rely on prior potentials for physically sound predictions outside the training data domain and the corresponding free energy surface is sensitive to errors in transition regions. The standard alternative to FM for classical potentials is relative entropy (RE) minimization, which has not yet been applied to NN potentials. In this work, we demonstrate for benchmark problems of liquid water and alanine dipeptide that RE training is more data efficient due to accessing the CG distribution during training, resulting in improved free energy surfaces and reduced sensitivity to prior potentials. In addition, RE learns to correct time integration errors, allowing larger time steps in CG molecular dynamics simulation while maintaining accuracy. Thus, our findings support the use of training objectives beyond FM as a promising direction for improving CG NN potential accuracy and reliability.
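For readers less familiar with the two training objectives compared in the abstract, the following is a brief sketch using the standard formulations from the coarse-graining literature (Noid et al. for force matching, Shell for relative entropy); the notation is illustrative and not taken from this paper. Force matching fits the CG force field to mapped atomistic forces using only the fixed all-atom (AA) data set, whereas the relative entropy gradient additionally requires expectations over the CG ensemble generated by the current model, which is why RE "accesses the CG distribution during training":

$$\mathcal{L}_{\mathrm{FM}}(\theta) \;=\; \Big\langle \big\| \mathbf{F}^{\mathrm{CG}}_{\theta}\!\big(M(\mathbf{r})\big) - \mathcal{M}\big(\mathbf{F}^{\mathrm{AA}}(\mathbf{r})\big) \big\|^{2} \Big\rangle_{\mathrm{AA}}$$

$$S_{\mathrm{rel}} \;=\; \big\langle \ln \tfrac{p_{\mathrm{AA}}(\mathbf{r})}{p_{\mathrm{CG}}\!\big(M(\mathbf{r})\big)} \big\rangle_{\mathrm{AA}} + \big\langle S_{\mathrm{map}} \big\rangle_{\mathrm{AA}},
\qquad
\nabla_{\theta} S_{\mathrm{rel}} \;=\; \beta\Big( \big\langle \nabla_{\theta} U^{\mathrm{CG}}_{\theta} \big\rangle_{\mathrm{AA}} - \big\langle \nabla_{\theta} U^{\mathrm{CG}}_{\theta} \big\rangle_{\mathrm{CG}} \Big)$$

Here $M$ is the coarse-graining map, $\mathcal{M}$ the corresponding force map, $S_{\mathrm{map}}$ the mapping entropy, and $\beta = 1/k_{B}T$. The second expectation in the RE gradient is evaluated by simulating the current CG model, so each training step sees the model's own distribution rather than only the AA reference data.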
