Learning Small Molecule Energies and Interatomic Forces with an Equivariant Transformer on the ANI-1x Dataset (2201.00802v1)

Published 3 Jan 2022 in physics.chem-ph

Abstract: Accurate predictions of interatomic energies and forces are essential for high-quality molecular dynamics (MD) simulations. Machine learning algorithms can be used to overcome limitations of classical MD by predicting ab initio-quality energies and forces. SE(3)-equivariant neural networks allow reasoning over spatial relationships while exploiting rotational and translational symmetries. One such algorithm is the SE(3)-Transformer, which we adapt for the ANI-1x dataset. Our early experimental results indicate, through ablation studies, that deeper networks - with additional SE(3)-Transformer layers - could reach the accuracies necessary for effective integration with MD. However, faster implementations of the SE(3)-Transformer will be required, such as the recently published accelerated version by Milesi.
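
To illustrate the general recipe such models follow, the sketch below shows how forces can be recovered as the negative gradient of a learned energy with respect to atomic positions via automatic differentiation. This is a minimal sketch under assumptions: ToyEnergyModel is a hypothetical stand-in (a plain MLP), not the paper's SE(3)-Transformer, and the names, shapes, and training details shown are illustrative only.

```python
import torch

# Assumption: a toy energy model standing in for the SE(3)-Transformer
# described in the paper. Any torch.nn.Module that maps atomic positions
# to a scalar energy fits the same energy-and-force pattern.
class ToyEnergyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.mlp = torch.nn.Sequential(
            torch.nn.Linear(3, 64),
            torch.nn.SiLU(),
            torch.nn.Linear(64, 1),
        )

    def forward(self, positions):
        # Sum per-atom contributions into a single molecular energy (scalar).
        return self.mlp(positions).sum()

model = ToyEnergyModel()
positions = torch.randn(8, 3, requires_grad=True)  # 8 atoms, xyz coordinates

energy = model(positions)

# Forces are the negative gradient of the predicted energy with respect
# to atomic positions, obtained by automatic differentiation. A training
# loss would typically combine energy and force errors against the
# ANI-1x reference labels.
forces = -torch.autograd.grad(energy, positions, create_graph=True)[0]

print(energy.item())   # scalar molecular energy
print(forces.shape)    # torch.Size([8, 3]) -- one force vector per atom
```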
