Accelerating High-Throughput Phonon Calculations via Machine Learning Universal Potentials

Published 12 Jul 2024 in cond-mat.mtrl-sci | (2407.09674v1)

Abstract: Phonons play a critical role in determining various material properties, but conventional methods for phonon calculations are computationally intensive, limiting their broad applicability. In this study, we present an approach to accelerate high-throughput harmonic phonon calculations using machine learning universal potentials. We train a state-of-the-art machine learning interatomic potential, based on multi-atomic cluster expansion (MACE), on a comprehensive dataset of 2,738 crystal structures with 77 elements, totaling 15,670 supercell structures, computed using high-fidelity density functional theory (DFT) calculations. Our approach significantly reduces the number of required supercells for phonon calculations while maintaining high accuracy in predicting harmonic phonon properties across diverse materials. The trained model is validated against phonon calculations for a held-out subset of 384 materials, achieving a mean absolute error (MAE) of 0.18 THz for vibrational frequencies from full phonon dispersions, 2.19 meV/atom for Helmholtz vibrational free energies at 300K, as well as a classification accuracy of 86.2% for dynamical stability of materials. A thermodynamic analysis of polymorphic stability in 126 systems demonstrates good agreement with DFT results at 300 K and 1000 K. In addition, the diverse and extensive high-quality DFT dataset curated in this study serves as a valuable resource for researchers to train and improve other machine learning interatomic potential models.

Summary

  • The paper trains a MACE-based universal machine learning potential that accelerates high-throughput DFT phonon calculations.
  • Trained on 15,670 supercell structures derived from 2,738 crystals, the model achieves a mean absolute error of 0.18 THz for vibrational frequencies.
  • The approach classifies dynamical stability with 86.2% accuracy and reproduces DFT polymorphic stability trends across diverse materials.

Accelerating High-Throughput Phonon Calculations via Machine Learning Universal Potentials

Introduction

The study titled "Accelerating High-Throughput Phonon Calculations via Machine Learning Universal Potentials" presents an approach to accelerating harmonic phonon calculations with machine learning interatomic potentials. By training a universal potential based on the multi-atomic cluster expansion (MACE) architecture, the study addresses the high computational cost of density functional theory (DFT) phonon calculations.

Methodology and Model

The authors employ MACE, a multi-atomic cluster expansion model, to predict atomic forces within crystal structures efficiently. MACE builds on message-passing neural networks (MPNNs), which are well suited to the graph-structured representation of atomic environments. The model substantially reduces the computational workload by predicting phonon properties from a reduced set of displaced supercell structures while retaining high force-prediction accuracy. During training, the model is fitted to force information from 15,670 supercell structures derived from 2,738 crystal structures.

Figure 1: Workflow chart showing the computational processes employed in our study.
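The finite-displacement workflow in Figure 1 can be sketched for a toy 1D monatomic chain, with a nearest-neighbor harmonic spring standing in for the trained potential. The spring constant, mass, and chain length below are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Toy finite-displacement phonon calculation for a 1D monatomic chain.
# A harmonic spring model stands in for the ML potential; k, m, and
# n_atoms are illustrative assumptions.
k = 10.0       # spring constant, eV/Angstrom^2 (assumed)
m = 28.0855    # atomic mass, amu (Si, illustrative)
n_atoms = 8    # atoms in the periodic "supercell"

def spring_forces(u):
    """Forces (eV/Angstrom) on each atom under nearest-neighbor springs."""
    return k * (np.roll(u, 1) + np.roll(u, -1) - 2.0 * u)

# Finite-displacement step: displace atom 0, record forces on all atoms,
# and form the force-constant row Phi(0, j) = -F_j / delta.
delta = 0.01
u = np.zeros(n_atoms)
u[0] = delta
phi = -spring_forces(u) / delta            # eV/Angstrom^2

# Dynamical matrix D(q) = (1/m) * sum_j Phi(0,j) cos(q r_j), minimum image.
j = np.arange(n_atoms)
r = np.where(j <= n_atoms // 2, j, j - n_atoms)   # positions in units of a
qs = np.linspace(0.0, np.pi, 50)
Dq = np.array([np.dot(phi, np.cos(q * r)) for q in qs]) / m

SQRT_EV_AMU_TO_THZ = 15.633302             # sqrt(eV/(amu*Angstrom^2)) -> THz
freqs = np.sqrt(np.clip(Dq, 0.0, None)) * SQRT_EV_AMU_TO_THZ

print(freqs[0], freqs[-1])   # acoustic branch: 0 at q=0, maximum at zone edge
```

Replacing `spring_forces` with calls to a trained universal potential is what turns this into a practical workflow: each force evaluation that DFT would spend a supercell calculation on becomes a fast model inference.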

Dataset Construction and Training

The training dataset spans 77 elements and includes compounds of varying structural complexity, emphasizing broad coverage of forces and energies to support generalization across material classes. Key to the MACE model's performance is its ability to generalize from this dataset, reducing the number of displaced supercell calculations typically required in DFT phonon workflows.

Figure 2: The heat map and distribution of elements and dataset properties crucial for model training.

Model Evaluation

Evaluation on a held-out test set of 384 materials demonstrates the MACE model's robustness in predicting phonon frequencies and thermodynamic quantities. The mean absolute error (MAE) for vibrational frequencies is 0.18 THz, showing that accuracy survives the reduced supercell requirements. Classification accuracy for dynamical stability reaches 86.2%, supporting the model's use in screening for stable material phases.

Figure 3: The MACE model's validation on force prediction, indicating strong correlation with DFT calculated forces.

Figure 4: Evaluation of the MACE model's performance on phonon frequency prediction versus DFT calculations.
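The dynamical-stability classification scored above reduces to checking the phonon spectrum for imaginary modes, which are conventionally reported as negative frequencies. A minimal sketch, in which the noise tolerance is an assumed value rather than one quoted by the paper:

```python
import numpy as np

# A structure is dynamically stable when its phonon spectrum has no
# imaginary modes. A small negative tolerance absorbs numerical noise in
# the acoustic modes near the Gamma point; -0.1 THz is an assumption here.
TOL_THZ = -0.1

def is_dynamically_stable(frequencies_thz):
    """True if no phonon frequency falls below the noise tolerance."""
    return float(np.min(frequencies_thz)) > TOL_THZ

# Example: a spectrum with a soft (imaginary) mode is flagged unstable.
stable_spectrum = np.array([0.0, 1.2, 3.4, 7.8])      # THz (made up)
unstable_spectrum = np.array([-1.5, 0.0, 2.1, 6.3])   # THz (made up)
print(is_dynamically_stable(stable_spectrum))    # True
print(is_dynamically_stable(unstable_spectrum))  # False
```

The reported 86.2% accuracy would then be the fraction of test materials for which this label, computed from MACE frequencies, matches the label computed from DFT frequencies.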

Thermodynamic Stability Analysis

The study extends the model to polymorphic thermodynamic stability via Helmholtz vibrational free energies. The analysis covers 126 polymorphs across 49 distinct systems, with MACE and DFT yielding consistent stability orderings at 300 K and 1000 K. This comparison has implications for predicting phase stability under varying thermodynamic conditions.

Figure 5: Thermodynamic stability of polymorphs within the phonon dataset.
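The Helmholtz vibrational free energy underlying this comparison follows from the standard harmonic expression F_vib(T) = Σ_i [hν_i/2 + k_B T ln(1 − exp(−hν_i/k_B T))], summed over phonon modes. A minimal NumPy sketch; the mode frequencies in the usage example are made up, not taken from the paper:

```python
import numpy as np

H_EV_S = 4.135667696e-15   # Planck constant, eV*s
KB_EV = 8.617333262e-5     # Boltzmann constant, eV/K

def helmholtz_vib(freqs_thz, temperature):
    """Harmonic vibrational Helmholtz free energy (eV) from mode frequencies.

    F_vib(T) = sum_i [ h*nu_i/2 + kB*T * ln(1 - exp(-h*nu_i / (kB*T))) ]
    Zero (and any imaginary) modes are skipped, a common convention.
    """
    e = H_EV_S * np.asarray(freqs_thz, dtype=float) * 1e12  # h*nu in eV
    e = e[e > 1e-8]
    t = KB_EV * temperature
    return float(np.sum(0.5 * e + t * np.log1p(-np.exp(-e / t))))

# Illustrative polymorph comparison: the phase with the lower F_vib at a
# given temperature is vibrationally favored (made-up spectra, in THz).
f_a_300 = helmholtz_vib([2.0, 5.0, 9.0], 300.0)
f_b_300 = helmholtz_vib([1.0, 4.0, 10.0], 300.0)
print(f_a_300, f_b_300)
```

At low temperature the zero-point term hν/2 dominates, while at high temperature the entropic logarithm drives F_vib down, which is why a stability ordering can flip between 300 K and 1000 K.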

Conclusion

This research highlights the efficacy of machine learning in accelerating computationally expensive DFT phonon calculations. The MACE model, trained on a systematically constructed dataset, offers a powerful alternative that mitigates the computational demands of high-throughput materials screening. Future work may incorporate anharmonic effects and extend the dataset to more complex compounds. The insights have broad implications for materials discovery and computational materials science at large.
