Evaluation of the MACE Force Field Architecture: from Medicinal Chemistry to Materials Science (2305.14247v2)
Abstract: The MACE architecture represents the state of the art in the field of machine learning force fields for a variety of in-domain, extrapolation, and low-data-regime tasks. In this paper, we further evaluate MACE by fitting models to published benchmark datasets. We show that MACE generally outperforms alternatives for a wide range of systems, from amorphous carbon, universal materials modelling, and general small-molecule organic chemistry to large molecules and liquid water. We demonstrate the capabilities of the model on tasks ranging from constrained geometry optimisation to molecular dynamics simulations and find excellent performance across all tested domains. We show that MACE is very data efficient, and can reproduce experimental molecular vibrational spectra when trained on as few as 50 randomly selected reference configurations. We further demonstrate that the strictly local, atom-centered model is sufficient for such tasks even in the case of large molecules and weakly interacting molecular assemblies.
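The abstract's claim about reproducing vibrational spectra from MD trajectories rests on a standard post-processing step: by the Wiener–Khinchin theorem, the vibrational density of states is the power spectrum of the atomic velocity autocorrelation function. The sketch below illustrates that step in generic form; it is not code from the paper, the function name is illustrative, and it works on any velocity trajectory regardless of which force field produced it.

```python
import numpy as np

def vibrational_spectrum(velocities, dt):
    """Vibrational density of states from an MD velocity trajectory.

    velocities : array of shape (n_steps, n_atoms, 3)
    dt         : MD timestep (same time unit as 1/frequency)

    Returns (frequencies, intensities). Uses the Wiener-Khinchin
    theorem: the Fourier transform of the velocity autocorrelation
    equals the power spectrum of the velocities themselves.
    """
    v = velocities.reshape(velocities.shape[0], -1)
    v = v - v.mean(axis=0)                    # remove centre-of-mass drift
    power = np.abs(np.fft.rfft(v, axis=0)) ** 2
    intensities = power.sum(axis=1)           # sum over atoms and components
    freqs = np.fft.rfftfreq(v.shape[0], d=dt)
    return freqs, intensities

# Synthetic check: a single 5.0-frequency oscillation should give a
# spectrum peaked at 5.0 (illustrative data, not a real trajectory).
dt, n_steps = 0.01, 1000
t = np.arange(n_steps) * dt
v = np.broadcast_to(np.cos(2 * np.pi * 5.0 * t)[:, None, None],
                    (n_steps, 2, 3)).copy()
freqs, intensities = vibrational_spectrum(v, dt)
```

In practice the velocities would come from an MD run driven by the fitted MACE model, and the peak positions of the resulting spectrum are what get compared against experimental IR data.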