
Quantum Hardware-Enabled Molecular Dynamics via Transfer Learning (2406.08554v1)

Published 12 Jun 2024 in physics.chem-ph, cond-mat.stat-mech, and quant-ph

Abstract: The ability to perform ab initio molecular dynamics simulations using potential energies calculated on quantum computers would allow virtually exact dynamics for chemical and biochemical systems, with substantial impacts on the fields of catalysis and biophysics. However, noisy hardware, the costs of computing gradients, and the number of qubits required to simulate large systems present major challenges to realizing the potential of dynamical simulations using quantum hardware. Here, we demonstrate that some of these issues can be mitigated by recent advances in machine learning. By combining transfer learning with techniques for building machine-learned potential energy surfaces, we propose a new path forward for molecular dynamics simulations on quantum hardware. We use transfer learning to reduce the number of energy evaluations that use quantum hardware by first training models on larger, less accurate classical datasets and then refining them on smaller, more accurate quantum datasets. We demonstrate this approach by training machine learning models to predict a molecule's potential energy using Behler-Parrinello neural networks. When successfully trained, the model enables energy gradient predictions necessary for dynamics simulations that cannot be readily obtained directly from quantum hardware. To reduce the quantum resources needed, the model is initially trained with data derived from low-cost techniques, such as Density Functional Theory, and subsequently refined with a smaller dataset obtained from the optimization of the Unitary Coupled Cluster ansatz. We show that this approach significantly reduces the size of the quantum training dataset while capturing the high accuracies needed for quantum chemistry simulations.
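The transfer-learning workflow the abstract describes — pretrain a neural network potential on a large, cheap dataset, then fine-tune it on a small, accurate one — can be sketched with a toy one-dimensional surrogate. Everything below is an illustrative assumption: the "low-level" and "high-level" surfaces stand in for DFT and UCC energies, and the tiny MLP is not the paper's Behler-Parrinello architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 1-D stand-ins for two levels of theory (NOT real DFT/UCC data):
def low_level(r):   # cheap surface, e.g. a harmonic approximation
    return 0.5 * (r - 1.0) ** 2

def high_level(r):  # accurate surface, e.g. a Morse-like potential
    return (1.0 - np.exp(-(r - 1.0))) ** 2

# Tiny one-hidden-layer MLP with tanh activation.
def init_params(hidden=16):
    return {
        "W1": rng.normal(0.0, 1.0, (hidden, 1)),
        "b1": np.zeros(hidden),
        "W2": rng.normal(0.0, 0.1, (1, hidden)),
        "b2": np.zeros(1),
    }

def forward(p, r):
    h = np.tanh(r[:, None] @ p["W1"].T + p["b1"])
    return (h @ p["W2"].T + p["b2"]).ravel(), h

def train(p, r, e, lr=0.05, epochs=2000):
    # Full-batch gradient descent on the mean squared energy error.
    for _ in range(epochs):
        pred, h = forward(p, r)
        g = 2.0 * (pred - e) / len(r)              # dLoss/dpred
        p["W2"] -= lr * (g[None, :] @ h)
        p["b2"] -= lr * g.sum()
        dh = (g[:, None] @ p["W2"]) * (1.0 - h ** 2)
        p["W1"] -= lr * (dh.T @ r[:, None])
        p["b1"] -= lr * dh.sum(axis=0)
    return p

# Step 1: pretrain on a large, cheap dataset (classical-data stand-in).
r_big = rng.uniform(0.5, 2.0, 400)
params = train(init_params(), r_big, low_level(r_big))

# Step 2: fine-tune on a much smaller, accurate dataset (quantum-data stand-in).
r_small = rng.uniform(0.5, 2.0, 20)
params = train(params, r_small, high_level(r_small), epochs=1000)

r_test = np.linspace(0.6, 1.9, 50)
rmse = np.sqrt(np.mean((forward(params, r_test)[0] - high_level(r_test)) ** 2))
print(f"RMSE vs high-level surface after fine-tuning: {rmse:.3f}")
```

The design point the paper leans on is in step 2: the fine-tuning set (20 points here) is far smaller than the pretraining set (400 points), mirroring how expensive quantum-hardware evaluations are reserved for refinement.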

Authors

  1. Abid Khan
  2. Prateek Vaish
  3. Yaoqi Pang
  4. Nikhil Kowshik
  5. Michael S. Chen
  6. Clay H. Batton
  7. Grant M. Rotskoff
  8. J. Wayne Mullinax
  9. Bryan K. Clark
  10. Brenda M. Rubenstein
  11. Norm M. Tubman
Citations (2)
