
Extending the reach of quantum computing for materials science with machine learning potentials (2203.07219v1)

Published 14 Mar 2022 in quant-ph, cond-mat.mtrl-sci, and physics.chem-ph

Abstract: Solving electronic structure problems represents a promising field of application for quantum computers. Much effort has been devoted to devising and optimizing quantum algorithms for quantum chemistry problems featuring up to hundreds of electrons. While quantum algorithms can in principle outperform their classical equivalents, the runtime, which scales polynomially with the number of constituents, can still prevent quantum simulations of large-scale systems. We propose a strategy to extend the scope of quantum computational methods to large-scale simulations using a machine learning potential trained on quantum simulation data. The challenge of applying machine learning potentials in today's quantum setting arises from the several sources of noise affecting the quantum computations of electronic energies and forces. We investigate the trainability of a machine learning potential under various sources of noise: statistical, optimization, and hardware noise. Finally, we construct the first machine learning potential from data computed on actual IBM Quantum processors for a hydrogen molecule. This already allows us to perform arbitrarily long and stable molecular dynamics simulations, outperforming all current quantum approaches to molecular dynamics and structure optimization.
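To illustrate the workflow the abstract describes, the following is a minimal sketch, not the authors' actual pipeline: it fits a machine learning potential to noisy H2 bond-length/energy data (synthetic Gaussian noise standing in for the statistical and hardware noise of a quantum processor) and then runs a short molecular dynamics trajectory on the learned surface. The Morse reference potential, the `ml_force` helper, the Gaussian-process model, and all numerical parameters and units are illustrative assumptions.

```python
# Sketch only: synthetic noisy data in place of quantum-processor energies,
# a scikit-learn Gaussian process in place of the authors' ML potential.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

def morse_energy(r, d_e=4.75, a=1.94, r_e=0.74):
    """Stand-in ground-truth H2 potential (illustrative eV / Angstrom values)."""
    return d_e * (1.0 - np.exp(-a * (r - r_e))) ** 2

# "Quantum" training data: reference energies plus noise mimicking shot/hardware noise.
r_train = np.linspace(0.4, 2.0, 20)
e_train = morse_energy(r_train) + rng.normal(scale=0.05, size=r_train.size)

# Gaussian-process potential; the WhiteKernel term absorbs the measurement noise.
gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=0.3) + WhiteKernel(noise_level=0.05**2),
    normalize_y=True,
)
gp.fit(r_train[:, None], e_train)

def ml_force(r, dr=1e-3):
    """Force on the bond length from the learned potential (centered finite difference)."""
    e_plus = gp.predict(np.array([[r + dr]]))[0]
    e_minus = gp.predict(np.array([[r - dr]]))[0]
    return -(e_plus - e_minus) / (2.0 * dr)

# Velocity-Verlet MD on the H2 bond length driven by the ML potential
# (arbitrary illustrative units and time step).
mass = 0.5
dt = 0.01
r, v = 0.9, 0.0          # start slightly stretched, at rest
for step in range(1000):
    f_old = ml_force(r)
    r += v * dt + 0.5 * (f_old / mass) * dt**2
    v += 0.5 * (f_old + ml_force(r)) / mass * dt
print(f"final bond length: {r:.3f}")
```

Because the noisy quantum evaluations are only used once, to train the potential, the subsequent dynamics is as cheap and stable as any classical force-field MD, which is the point the abstract makes about arbitrarily long trajectories.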

Authors (7)
  1. Julian Schuhmacher (7 papers)
  2. Guglielmo Mazzola (43 papers)
  3. Francesco Tacchino (42 papers)
  4. Olga Dmitriyeva (1 paper)
  5. Tai Bui (4 papers)
  6. Shanshan Huang (13 papers)
  7. Ivano Tavernelli (88 papers)
Citations (6)
