Cluster expansion by transfer learning for phase stability predictions (2311.06179v4)
Abstract: Recent progress towards universal machine-learned interatomic potentials holds considerable promise for materials discovery. Yet the accuracy of these potentials for predicting phase stability may still be limited. In contrast, cluster expansions provide accurate phase stability predictions but are computationally demanding to parameterize from first principles, especially for structures of low dimension or with a large number of components, such as interfaces or multimetal catalysts. We overcome this trade-off via transfer learning. Using Bayesian inference, we incorporate prior statistical knowledge from machine-learned and physics-based potentials, enabling us to sample the most informative configurations and to efficiently fit first-principles cluster expansions. This algorithm is tested on Pt:Ni, showing robust convergence of the mixing energies as a function of sample size with reduced statistical fluctuations.
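The core idea of the abstract, fitting a first-principles cluster expansion under a Bayesian prior transferred from a cheaper potential, can be sketched with ordinary Bayesian linear regression. The sketch below is an illustration only, not the paper's implementation: the correlation matrix, noise level, and prior coefficients are synthetic placeholders, and the prior mean stands in for effective cluster interactions (ECIs) pre-fitted to a machine-learned or physics-based potential. The last step mimics informative-configuration sampling by selecting the pool structure with the largest predictive variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 8 configurations described by 4 cluster correlation
# functions; synthetic "true" ECIs generate DFT-like mixing energies.
n_cfg, n_clusters = 8, 4
X = rng.uniform(-1.0, 1.0, size=(n_cfg, n_clusters))  # correlation matrix
true_eci = np.array([-0.30, 0.12, -0.05, 0.02])       # eV/atom (illustrative)
sigma = 0.005                                         # assumed DFT noise level
y = X @ true_eci + rng.normal(0.0, sigma, n_cfg)      # "first-principles" data

# Transfer-learning prior: ECIs from a cheap surrogate potential, mimicked
# here by perturbing the true ECIs (assumption for illustration only).
mu0 = true_eci + rng.normal(0.0, 0.05, n_clusters)
prior_prec = np.eye(n_clusters) / 0.05**2             # inverse prior covariance

# Bayesian linear regression: Gaussian posterior over ECIs given data + prior.
posterior_cov = np.linalg.inv(X.T @ X / sigma**2 + prior_prec)
posterior_mean = posterior_cov @ (X.T @ y / sigma**2 + prior_prec @ mu0)

# Active sampling: choose the next configuration to compute with DFT as the
# pool structure with the largest posterior predictive variance.
pool = rng.uniform(-1.0, 1.0, size=(100, n_clusters))
pred_var = np.einsum("ij,jk,ik->i", pool, posterior_cov, pool)
next_cfg = int(np.argmax(pred_var))
```

Because the prior tightens the posterior before any expensive calculations are added, the fitted ECIs converge with fewer first-principles samples and smaller statistical fluctuations, which is the trade-off the abstract targets.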