Heterogeneous Ensemble Enables a Universal Uncertainty Metric for Atomistic Foundation Models (2507.21297v1)

Published 28 Jul 2025 in cond-mat.mtrl-sci

Abstract: Universal machine learning interatomic potentials (uMLIPs) are reshaping atomistic simulation as foundation models, delivering near ab initio accuracy at a fraction of the cost. Yet the lack of reliable, general uncertainty quantification limits their safe, wide-scale use. Here we introduce a unified, scalable uncertainty metric U based on a heterogeneous model ensemble with reuse of pretrained uMLIPs. Across chemically and structurally diverse datasets, U shows a strong correlation with the true prediction errors and provides a robust ranking of configuration-level risk. Leveraging this metric, we propose an uncertainty-aware model distillation framework to produce system-specific potentials: for W, an accuracy comparable to full-DFT training is achieved using only 4% of the DFT labels; for MoNbTaW, no additional DFT calculations are required. Notably, by filtering numerical label noise, the distilled models can, in some cases, surpass the accuracy of the DFT reference labels. The uncertainty-aware approach offers a practical monitor of uMLIP reliability in deployment, and guides data selection and fine-tuning strategies, thereby advancing the construction and safe use of foundation models and enabling cost-efficient development of accurate, system-specific potentials.
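
The abstract does not spell out how U is computed, only that it is built from the disagreement of a heterogeneous ensemble of pretrained uMLIPs and can be used to rank and select configurations. The snippet below is a minimal, hypothetical sketch under that assumption: each ensemble member is represented by its per-atom force predictions for a configuration, U is taken as the mean per-atom spread of those predictions, and a simple threshold picks high-uncertainty configurations for labeling. All function and variable names are illustrative, not from the paper.

```python
# Hypothetical illustration of an ensemble-disagreement uncertainty metric.
# Assumption: each pretrained uMLIP returns per-atom forces of shape (n_atoms, 3);
# the paper's exact definition of U may differ.
import numpy as np


def ensemble_uncertainty(forces_per_model: list[np.ndarray]) -> float:
    """Mean per-atom force spread across heterogeneous ensemble members."""
    stacked = np.stack(forces_per_model, axis=0)               # (n_models, n_atoms, 3)
    per_component_std = stacked.std(axis=0)                    # disagreement per component
    per_atom_std = np.linalg.norm(per_component_std, axis=-1)  # (n_atoms,)
    return float(per_atom_std.mean())


def select_for_labeling(uncertainties: list[float], threshold: float) -> list[int]:
    """Indices of configurations whose uncertainty exceeds the threshold
    (a stand-in for uncertainty-aware data selection)."""
    return [i for i, u in enumerate(uncertainties) if u > threshold]


# Toy usage: three "models" disagreeing slightly on two 4-atom configurations.
rng = np.random.default_rng(0)
configs = []
for _ in range(2):
    base = rng.normal(size=(4, 3))
    configs.append([base + 0.05 * rng.normal(size=base.shape) for _ in range(3)])

u_values = [ensemble_uncertainty(forces) for forces in configs]
print("U per configuration:", [round(u, 4) for u in u_values])
print("Selected for DFT labeling:", select_for_labeling(u_values, threshold=0.05))
```

This sketch uses the force standard deviation purely as a plausible disagreement measure; any per-configuration spread statistic over the heterogeneous ensemble would slot into the same workflow.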

Summary

We haven't generated a summary for this paper yet.

Follow-up Questions

We haven't generated follow-up questions for this paper yet.