Evaluation of the MACE Force Field Architecture: from Medicinal Chemistry to Materials Science (2305.14247v2)

Published 23 May 2023 in physics.chem-ph and stat.ML

Abstract: The MACE architecture represents the state of the art in the field of machine learning force fields for a variety of in-domain, extrapolation and low-data regime tasks. In this paper, we further evaluate MACE by fitting models for published benchmark datasets. We show that MACE generally outperforms alternatives for a wide range of systems from amorphous carbon, universal materials modelling, and general small molecule organic chemistry to large molecules and liquid water. We demonstrate the capabilities of the model on tasks ranging from constrained geometry optimisation to molecular dynamics simulations and find excellent performance across all tested domains. We show that MACE is very data efficient, and can reproduce experimental molecular vibrational spectra when trained on as few as 50 randomly selected reference configurations. We further demonstrate that the strictly local atom-centered model is sufficient for such tasks even in the case of large molecules and weakly interacting molecular assemblies.

