
CatTSunami: Accelerating Transition State Energy Calculations with Pre-trained Graph Neural Networks (2405.02078v3)

Published 3 May 2024 in cond-mat.mtrl-sci

Abstract: Direct access to transition state energies at low computational cost unlocks the possibility of accelerating catalyst discovery. We show that the top-performing graph neural network potential trained on the OC20 dataset, a related but different task, is able to find transition states energetically similar (within 0.1 eV) to density functional theory (DFT) 91% of the time with a 28x speedup. This speaks to the generalizability of the models: having never been explicitly trained on reactions, the machine-learned potential approximates the potential energy surface well enough to be performant for this auxiliary task. We introduce the Open Catalyst 2020 Nudged Elastic Band (OC20NEB) dataset, comprising 932 DFT nudged elastic band calculations, to benchmark machine-learned model performance on transition state energies. To demonstrate the efficacy of this approach, we replicated a well-known, large reaction network with 61 intermediates and 174 dissociation reactions at DFT resolution (40 meV). In this case of dense NEB enumeration, we realized even greater computational cost savings, using just 12 GPU days of compute where DFT would have taken 52 GPU years, a 1500x speedup. Similar searches for complete reaction networks could become routine using the approach presented here. Finally, we replicated an ammonia synthesis activity volcano and systematically found lower-energy configurations of the transition states and intermediates on six stepped unary surfaces. This scalable approach offers a more complete treatment of configurational space to improve and accelerate catalyst discovery.


Summary

  • The paper introduces a method that uses graph neural networks pretrained on the OC20 dataset to predict transition state energies within 0.1 eV of DFT in 91% of cases.
  • It achieves a 28x speedup in force evaluations and up to 1500x overall acceleration for exploring reaction networks compared to traditional DFT methods.
  • The approach enables rapid catalyst discovery by dramatically reducing computational costs and time required for large-scale NEB calculations.

CatTSunami: Accelerating Transition State Energy Calculations with Pre-trained Graph Neural Networks

The research paper titled "CatTSunami: Accelerating Transition State Energy Calculations with Pre-trained Graph Neural Networks" introduces an approach to calculating transition state energies with machine learning (ML), specifically graph neural network (GNN) potentials. The paper's primary objective is to reduce the computational cost typically associated with density functional theory (DFT) calculations, facilitating faster catalyst discovery, a vital step toward sustainable chemical processes in the context of climate change.

Study Overview

The authors propose using a graph neural network potential, trained on the Open Catalyst 2020 (OC20) dataset, to estimate transition state energies at significantly reduced computational expense. Their best model finds transition state energies within 0.1 eV of DFT-calculated values in 91% of cases while offering a 28x speedup in force evaluations. They also introduce the OC20 Nudged Elastic Band (OC20NEB) dataset, comprising 932 DFT nudged elastic band (NEB) calculations, for benchmarking ML models on this task.
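
The core mechanic is straightforward to reproduce: swap the DFT calculator inside a standard NEB run for a pretrained GNN potential exposed through ASE's calculator interface. Below is a minimal sketch under stated assumptions: relaxed endpoint structures already on disk, a fairchem-style OCPCalculator as the ML potential, and an illustrative checkpoint filename that does not refer to the paper's exact model.

```python
# Minimal ML-driven NEB sketch using ASE; assumes relaxed endpoints on disk.
from ase.io import read
from ase.neb import NEB          # ase.mep.NEB in newer ASE releases
from ase.optimize import BFGS
from fairchem.core import OCPCalculator  # assumed import path for an OC20-style model

initial = read("initial.traj")   # relaxed initial state (adsorbate + slab)
final = read("final.traj")       # relaxed final state

# Build a band of linearly interpolated images between the endpoints.
n_images = 10
images = [initial] + [initial.copy() for _ in range(n_images - 2)] + [final]

# Climbing-image NEB pushes the highest image toward the saddle point.
neb = NEB(images, climb=True, allow_shared_calculator=True)
neb.interpolate()

# One shared GNN calculator supplies forces in place of DFT.
calc = OCPCalculator(checkpoint_path="oc20_gnn_checkpoint.pt")  # illustrative filename
for image in images[1:-1]:
    image.calc = calc

BFGS(neb, trajectory="neb.traj").run(fmax=0.05)

# The transition state energy is the highest-energy interior image of the band.
ts_energy = max(image.get_potential_energy() for image in images[1:-1])
print(f"Transition state energy: {ts_energy:.3f} eV")
```

Because the GNN evaluates forces orders of magnitude faster than DFT, a loop like this that is prohibitive with a plane-wave code becomes cheap enough to run thousands of times.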

A noteworthy aspect of this paper is its replication of a large, well-known reaction network at DFT resolution using substantially less computation: 12 GPU days, compared to an estimated 52 GPU years with conventional DFT, roughly a 1500x acceleration. This demonstrates the capability of ML-based frameworks to explore and analyze extensive reaction networks effectively.
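
The headline ratio follows directly from the reported compute budgets; a quick back-of-the-envelope check:

```python
# Sanity check of the reported speedup: 52 GPU years of DFT vs 12 GPU days of ML.
dft_gpu_days = 52 * 365          # ~18,980 GPU days
ml_gpu_days = 12
print(f"speedup ~ {dft_gpu_days / ml_gpu_days:.0f}x")  # ~1582x, quoted as ~1500x
```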

Methodology and Findings

The "CatTSunami" method involves structured neural networks that can abstract and infer transition states despite not being trained explicitly on reaction data. Using the OC20NEB dataset, the researchers validate multiple ML models, with the Equiformer v2 model showing the best performance. Reaction mechanisms of complex networks like CO hydrogenation on Rh (111) surfaces and ammonia synthesis on stepped unary surfaces were studied, demonstrating the method's applicability across different materials and reactions.

Key Numerical Results:

  • Found transition states within 0.1 eV of DFT-calculated energies 91% of the time (the success criterion is sketched after this list).
  • Achieved a 28x speedup in force evaluations and up to a 1500x speedup when generating reaction networks.
  • Explored 19,000 NEB pathways in a practical CO hydrogenation example, demonstrating the ML approach's ability to discover lower-energy pathways.
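
The 91% figure rests on a simple bookkeeping criterion: an ML-driven NEB counts as a success when its transition state energy lands within 0.1 eV of the DFT reference (the paper's full criterion also accounts for convergence of the band, which this sketch omits). A minimal illustration with made-up example values:

```python
import numpy as np

# Hypothetical transition state energies (eV) for the same reactions,
# one set from DFT NEBs and one from ML-driven NEBs; values are illustrative.
e_ts_dft = np.array([1.12, 0.85, 2.30, 0.47])
e_ts_ml = np.array([1.05, 0.88, 2.55, 0.51])

# Success: the ML transition state energy falls within 0.1 eV of the DFT value.
success = np.abs(e_ts_ml - e_ts_dft) <= 0.1
print(f"success rate: {success.mean():.0%}")  # 3 of 4 here -> 75%
```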

Implications and Future Work

This paper bridges a significant gap in heterogeneous catalysis by providing a tool that aligns computational research more closely with experimental applications. The ability to perform large-scale NEB calculations quickly means that the nuances of real-world chemistry can be explored without prohibitive computational costs, thus supporting rapid catalyst discovery necessary to address urgent environmental challenges.

In practical terms, the implications for industrial catalysis are significant, as this approach makes it feasible to exhaustively explore possible reaction mechanisms and quickly identify promising catalysts. Moreover, eliminating many assumptions inherent in current screening methods could lead to more reliable experimental predictions.

Speculation on Future Developments:

  • Extending these ML frameworks could facilitate tangible advancements in other domains where transition states are pivotal, such as materials science and molecular electronics.
  • Incorporating alternative transition state search methods, such as the dimer or string methods, and continually refining the ML models on new datasets will enhance the fidelity and applicability of the models across various chemical systems.

Overall, "CatTSunami" represents an evolution in computational catalysis, empowering researchers to explore broader and deeper chemical spaces at unprecedented speeds, thereby catalyzing the discovery process of materials essential for a sustainable future.