Coherent energy and force uncertainty in deep learning force fields (2312.04174v1)

Published 7 Dec 2023 in stat.ML, cs.LG, and physics.comp-ph

Abstract: In machine learning energy potentials for atomic systems, forces are commonly obtained as the negative derivative of the energy function with respect to atomic positions. To quantify aleatoric uncertainty in the predicted energies, a widely used modeling approach involves predicting both a mean and variance for each energy value. However, this model is not differentiable under the usual white noise assumption, so energy uncertainty does not naturally translate to force uncertainty. In this work we propose a machine learning potential energy model in which energy and force aleatoric uncertainty are linked through a spatially correlated noise process. We demonstrate our approach on an equivariant message passing neural network potential trained on energies and forces from two out-of-equilibrium molecular datasets. Furthermore, we also show how to obtain epistemic uncertainties in this setting based on a Bayesian interpretation of deep ensemble models.
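Two of the building blocks the abstract relies on can be sketched concretely: forces obtained as the negative gradient of an energy function, and epistemic uncertainty from a deep ensemble of mean-variance predictors via the law of total variance. The snippet below is a minimal illustration only, not the paper's model; the toy harmonic energy and the example ensemble numbers are hypothetical, and the paper's spatially correlated noise construction is not reproduced here.

```python
import numpy as np

def energy(x):
    # Toy harmonic potential standing in for a learned energy model.
    return 0.5 * np.sum(x ** 2)

def forces(x):
    # Forces are the negative gradient of the energy w.r.t. positions;
    # for the harmonic potential above this is simply -x.
    return -x

def ensemble_predict(means, aleatoric_vars):
    # Standard deep-ensemble combination (Lakshminarayanan et al., 2017):
    # the predictive mean is the average of the member means, and by the
    # law of total variance the predictive variance splits into an
    # aleatoric part (average member variance) and an epistemic part
    # (variance of the member means).
    mu = means.mean()
    epistemic = means.var()
    total = aleatoric_vars.mean() + epistemic
    return mu, total, epistemic

x = np.array([1.0, -2.0, 0.5])
f = forces(x)  # negative gradient of the toy energy: [-1., 2., -0.5]

# Hypothetical mean/variance predictions from three ensemble members
# for a single energy value.
means = np.array([1.02, 0.98, 1.05])
alea = np.array([0.04, 0.05, 0.03])
mu, total_var, epi_var = ensemble_predict(means, alea)
```

Note that this sketch treats energy uncertainty only; the point of the paper is precisely that, under white noise, the per-energy variance above has no well-defined derivative and therefore does not induce a force uncertainty on its own.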
