
Understanding and improving transferability in machine-learned activation energy predictors (2505.00604v1)

Published 1 May 2025 in physics.chem-ph

Abstract: The calculation of reactive properties is a challenging task in chemical reaction discovery. Machine learning (ML) methods play an important role in accelerating electronic structure predictions of activation energies and reaction enthalpies, and are a crucial ingredient to enable large-scale automated reaction network discovery with $>10^3$ reactions. Unfortunately, the predictive accuracy of existing ML models does not yet reach the required accuracy across the space of possible chemical reactions to enable subsequent kinetic simulations that even qualitatively agree with experimental kinetics. Here, we comprehensively assess the underlying reasons for prediction failures within a selection of machine-learned models of reactivity. Models based on difference fingerprints between reactant and product structures lack transferability despite providing good in-distribution predictions. This results in a significant loss of information about the context and mechanism of chemical reactions. We propose a convolutional ML model that uses atom-centered quantum-chemical descriptors and approximate transition state information. Inclusion of the latter improves transferability for out-of-distribution benchmark reactions, making greater use of the limited chemical reaction space spanned by the training data. The model further delivers atom-level contributions to activation energies and reaction enthalpies that provide a useful interpretational tool for rationalizing reactivity.
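The atom-level decomposition mentioned in the abstract can be illustrated with a minimal sketch: if a model produces one scalar contribution per atom from atom-centered descriptors, the predicted activation energy is the sum over atoms, and the individual terms can be inspected for interpretation. All names, array shapes, and the linear per-atom readout below are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical atom-centered descriptors for a 5-atom reaction: each row
# might hold quantum-chemical features (e.g. charges, bond orders) drawn
# from reactant, approximate transition state, and product structures.
n_atoms, n_features = 5, 8
descriptors = rng.normal(size=(n_atoms, n_features))

# Hypothetical learned weights of a per-atom readout layer (stand-in for
# whatever the trained model's final layer would be).
w = rng.normal(size=n_features)
b = 0.1

# One scalar contribution per atom.
atom_contributions = descriptors @ w + b

# The predicted barrier is the sum of atom-level terms, so each atom's
# share of the activation energy can be rationalized separately.
predicted_barrier = atom_contributions.sum()

print(atom_contributions.shape)  # → (5,)
```

The design point is simply that an additive-per-atom output head makes the prediction decomposable: swapping in any per-atom network preserves the interpretability as long as the final aggregation is a sum.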
