A Universal Graph Deep Learning Interatomic Potential for the Periodic Table (2202.02450v2)

Published 5 Feb 2022 in cond-mat.mtrl-sci and physics.chem-ph

Abstract: Interatomic potentials (IAPs), which describe the potential energy surface of atoms, are a fundamental input for atomistic simulations. However, existing IAPs are either fitted to narrow chemistries or too inaccurate for general applications. Here, we report a universal IAP for materials based on graph neural networks with three-body interactions (M3GNet). The M3GNet IAP was trained on the massive database of structural relaxations performed by the Materials Project over the past 10 years and has broad applications in structural relaxation, dynamic simulations and property prediction of materials across diverse chemical spaces. About 1.8 million materials were identified from a screening of 31 million hypothetical crystal structures to be potentially stable against existing Materials Project crystals based on M3GNet energies. Of the top 2000 materials with the lowest energies above hull, 1578 were verified to be stable using DFT calculations. These results demonstrate a machine learning-accelerated pathway to the discovery of synthesizable materials with exceptional properties.

Citations (357)

Summary

  • The paper introduces the M3GNet model that leverages graph neural networks with three-body interactions to accurately predict energies, forces, and stresses for diverse materials.
  • It was trained on a large dataset of roughly 187,000 energies, with associated forces and stresses, drawn from Materials Project structural relaxations, and is benchmarked against classical potentials and DFT.
  • The model accelerates materials discovery by reliably identifying synthesizable compounds and streamlining high-throughput structural relaxations.

A Universal Graph Deep Learning Interatomic Potential for the Periodic Table

The paper "A Universal Graph Deep Learning Interatomic Potential for the Periodic Table" presents a significant advancement in the modeling of interatomic potentials (IAPs) that could broadly impact materials science research. Traditional IAPs have either been limited to narrow chemistries or lacked the accuracy required for general applications. The M3GNet model, a universal IAP, uses graph neural networks (GNNs) with three-body interactions to accurately predict the potential energy surface of a wide variety of materials across the periodic table. Built on the extensive data curated by the Materials Project, the model was used to screen 31 million hypothetical structures, identifying about 1.8 million potentially stable materials.

Methodology and Implementation

The M3GNet model captures extensive chemical diversity by integrating many-body features into traditional graph representations of materials. It employs a combination of local environment descriptors, such as interatomic distances and angles, which serve as inputs to graph-based machine learning frameworks. The model leverages a dataset exceeding 187,000 energies and numerous force and stress components obtained from structural relaxations, covering 89 elements. The architecture is designed to handle tensorial quantities such as forces and stresses through automatic differentiation, making it well-suited for applications in dynamic simulations and structural relaxations.
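
The key design choice above, obtaining forces as the negative gradient of a predicted energy via automatic differentiation, can be illustrated with a minimal sketch. M3GNet does this with reverse-mode autodiff inside its deep-learning framework; here a tiny forward-mode dual-number class and a toy 1D Lennard-Jones energy stand in for the real model (all names and the potential itself are illustrative):

```python
class Dual:
    """Dual number a + b*eps: carries a value and its derivative."""
    def __init__(self, val, d=0.0):
        self.val, self.d = val, d

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.d + other.d)
    __radd__ = __add__

    def __sub__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val - other.val, self.d - other.d)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.d + self.d * other.val)
    __rmul__ = __mul__

    def __pow__(self, n):  # real exponent only
        return Dual(self.val ** n, n * self.val ** (n - 1) * self.d)

def lj_energy(xs):
    """Total Lennard-Jones energy of 1D positions (epsilon = sigma = 1)."""
    e = Dual(0.0)
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            inv6 = ((xs[j] - xs[i]) * (xs[j] - xs[i])) ** -3
            e = e + 4.0 * (inv6 * inv6 - inv6)
    return e

def force(xs, k):
    """Force on atom k: F_k = -dE/dx_k, obtained by differentiation."""
    duals = [Dual(x, 1.0 if i == k else 0.0) for i, x in enumerate(xs)]
    return -lj_energy(duals).d

positions = [0.0, 1.2, 2.5]
forces = [force(positions, k) for k in range(len(positions))]
print(forces)  # internal forces sum to zero (Newton's third law)
```

Because forces come from differentiating a single scalar energy, they are automatically consistent with it, which is what makes the same model usable for both relaxations and dynamics.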

Results and Comparisons

When benchmarked against classical potentials and existing machine learning-based IAPs, M3GNet demonstrated superior versatility and comparable accuracy in predicting energies and forces. This was achieved without the combinatorial explosion of training data typically encountered when extending ML-IAPs to multi-element chemistries. The model achieved low mean absolute errors (MAEs) in energy, force, and stress predictions, which translate into accurate estimates of material properties such as bulk moduli and phonon densities of states.

Remarkably, M3GNet-relaxed structures showed close agreement with DFT-relaxed counterparts, thus validating its efficiency and accuracy for high-throughput computational tasks. This capability is particularly important for stability assessments, where energy differences can be crucial.
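
What "M3GNet-relaxed" means in practice can be sketched minimally: steepest-descent relaxation driven by forces from an energy model. The 1D double-well energy below is a toy stand-in for the learned potential energy surface, and the finite-difference force stands in for autodiff (all quantities are illustrative, in arbitrary units):

```python
def energy(x):
    """Toy 1D double-well energy with minima at x = 0.5 and x = 2.5."""
    u = x - 1.5
    return u ** 4 - 2.0 * u ** 2

def force(x, h=1e-6):
    """Force = -dE/dx (central finite difference, for illustration)."""
    return -(energy(x + h) - energy(x - h)) / (2.0 * h)

def relax(x0, step=0.05, f_tol=1e-8, max_iter=1000):
    """Move along the force until it (nearly) vanishes."""
    x = x0
    for _ in range(max_iter):
        f = force(x)
        if abs(f) < f_tol:
            break
        x += step * f
    return x

x_relaxed = relax(0.2)
print(x_relaxed)  # converges to the nearby minimum at x = 0.5
```

Replacing each DFT force call in such a loop with a cheap ML-IAP evaluation is what makes high-throughput relaxation of millions of candidates tractable.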

Implications and Future Directions

The ability to assess the stability of roughly 1.8 million candidate materials significantly expands the pool for materials discovery, opening up previously inaccessible chemical spaces. The M3GNet model efficiently flags potentially synthesizable compounds, including materials not currently in the Materials Project database.
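
A hypothetical sketch of this screening step: rank candidates by predicted energy above hull (eV/atom) and keep the lowest-energy subset for DFT verification, mirroring the paper's selection of the top-2000 lowest-energy candidates. The formulas, energies, and thresholds below are made up for illustration:

```python
candidates = [
    ("Li2MgSi", 0.003),
    ("NaAlSiO4", 0.210),
    ("Ca3TiN4", 0.001),
    ("K2ZrF6", 0.052),
]

def select_for_dft(cands, top_k=2, e_hull_max=0.1):
    """Keep candidates with E_hull <= e_hull_max, lowest energy first."""
    kept = [c for c in cands if c[1] <= e_hull_max]
    kept.sort(key=lambda c: c[1])
    return kept[:top_k]

shortlist = select_for_dft(candidates)
print(shortlist)  # [('Ca3TiN4', 0.001), ('Li2MgSi', 0.003)]
```

The cheap ML energy acts as a pre-filter, so the expensive DFT verification is spent only on the most promising structures.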

Potential applications extend well beyond static property predictions. The model can serve as a surrogate for DFT in molecular dynamics (MD) simulations, enabling the exploration of transport properties and dynamic stability of materials. As such, M3GNet paves the way for rapid materials discovery workflows, where initial structure relaxations are conducted at a fraction of the computational cost of traditional DFT calculations.
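
Using the model as a DFT surrogate in MD amounts to calling it inside a standard integrator. A minimal velocity-Verlet sketch, with force() standing in for the ML-potential force evaluation (a harmonic force is used here so the exact answer is known; all quantities are illustrative):

```python
def force(x, k=1.0):
    return -k * x  # stand-in for the ML-IAP force call

def velocity_verlet(x, v, dt, n_steps, m=1.0):
    """Standard velocity-Verlet time stepping."""
    f = force(x)
    for _ in range(n_steps):
        x += v * dt + 0.5 * (f / m) * dt * dt
        f_new = force(x)
        v += 0.5 * (f + f_new) / m * dt
        f = f_new
    return x, v

# One full oscillation period is T = 2*pi for k = m = 1.
x, v = velocity_verlet(1.0, 0.0, dt=0.01, n_steps=628)
print(x, v)  # returns near the starting point (1.0, 0.0)
```

Because the ML force is orders of magnitude cheaper than a DFT force call, trajectories long enough to extract transport properties become affordable.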

The adaptability of the M3GNet framework also opens avenues for further development. Enhancements could include retraining on higher-accuracy DFT datasets or active learning strategies that iteratively refine and expand the training data. Moreover, the framework's potential extends to molecular systems, highlighting its versatility beyond crystalline materials.

In conclusion, the developments presented in this paper constitute a strategic advance for computational materials science. M3GNet's integration of graph-based deep learning with extensive first-principles data positions it as a key tool for materials discovery and scientific exploration across vastly diverse chemical spaces.