
Data-Efficient Construction of High-Fidelity Graph Deep Learning Interatomic Potentials (2409.00957v1)

Published 2 Sep 2024 in physics.comp-ph, cond-mat.mtrl-sci, and physics.chem-ph

Abstract: Machine learning potentials (MLPs) have become an indispensable tool in large-scale atomistic simulations because of their ability to reproduce ab initio potential energy surfaces (PESs) very accurately at a fraction of the computational cost. For computational efficiency, the training data for most MLPs today are computed using relatively cheap density functional theory (DFT) methods such as the Perdew-Burke-Ernzerhof (PBE) generalized gradient approximation (GGA) functional. Meta-GGAs such as the recently developed strongly constrained and appropriately normed (SCAN) functional have been shown to yield significantly improved descriptions of atomic interactions for diversely bonded systems, but their higher computational cost remains an impediment to their use in MLP development. In this work, we outline a data-efficient multi-fidelity approach to constructing Materials 3-body Graph Network (M3GNet) interatomic potentials that integrate different levels of theory within a single model. Using silicon and water as examples, we show that a multi-fidelity M3GNet model trained on a combined dataset of low-fidelity GGA calculations with 10% of high-fidelity SCAN calculations can achieve accuracies comparable to a single-fidelity M3GNet model trained on a dataset comprising 8x the number of SCAN calculations. This work paves the way for the development of high-fidelity MLPs in a cost-effective manner by leveraging existing low-fidelity datasets.
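The core data-preparation idea in the abstract — pooling a large low-fidelity (GGA) dataset with a small fraction of high-fidelity (SCAN) calculations, with each entry tagged by its level of theory so a single model can condition on fidelity — can be sketched as follows. This is a minimal illustration, not the authors' code: the function, field names, and the 10% subsampling parameter are assumptions for demonstration only.

```python
import random

def build_multifidelity_dataset(gga_samples, scan_samples,
                                scan_fraction=0.10, seed=0):
    """Combine low-fidelity (GGA) data with a random subset of
    high-fidelity (SCAN) data, tagging each entry with a fidelity
    index (0 = GGA, 1 = SCAN). Illustrative sketch only; the actual
    multi-fidelity M3GNet pipeline is described in the paper."""
    rng = random.Random(seed)
    # Keep only a fraction (e.g. 10%) of the expensive SCAN calculations.
    n_scan = max(1, round(scan_fraction * len(scan_samples)))
    scan_subset = rng.sample(scan_samples, n_scan)
    dataset = [{"sample": s, "fidelity": 0} for s in gga_samples]
    dataset += [{"sample": s, "fidelity": 1} for s in scan_subset]
    rng.shuffle(dataset)  # mix fidelities for training
    return dataset

# Hypothetical usage with placeholder structure labels:
combined = build_multifidelity_dataset(
    [f"gga_{i}" for i in range(100)],
    [f"scan_{i}" for i in range(100)],
)
```

The fidelity tag would then be fed to the model (e.g. via a learned fidelity embedding) so that one set of shared graph-network weights describes both levels of theory.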
