Accelerating Training of MLIPs Through Small-Cell Training (2304.01314v2)

Published 3 Apr 2023 in cond-mat.mtrl-sci

Abstract: While machine-learned interatomic potentials have become a mainstay for modeling materials, designing training sets that lead to robust potentials is challenging. Automated methods, such as active learning and on-the-fly learning, construct reliable training sets, but these processes can be resource-intensive. Current training approaches often use density functional theory (DFT) calculations that have the same cell size as the simulations that the potential is explicitly trained to model. Here, we demonstrate an easy-to-implement small-cell training protocol and use it to model the Zr-H system. This training leads to a potential that accurately predicts known stable Zr-H phases and reproduces the $\alpha$-$\beta$ pure zirconium phase transition in molecular dynamics simulations. Compared to traditional active learning, small-cell training decreased the training time of the $\alpha$-$\beta$ zirconium phase transition by approximately 20 times. The potential describes the phase transition with a degree of accuracy similar to that of the large-cell training method.
