
Efficient classical calculation of the Quantum Natural Gradient (2011.02991v1)

Published 5 Nov 2020 in quant-ph and physics.comp-ph

Abstract: Quantum natural gradient has emerged as a superior minimisation technique in quantum variational algorithms. Classically simulating the algorithm running on near-future quantum hardware is paramount in its study, as it is for all variational algorithms. In this case, state-vector simulation of the P-parameter/gate ansatz circuit does not dominate the runtime; instead, calculation of the Fisher information matrix becomes the bottleneck, involving O(P^3) gate evaluations, though this can be reduced to O(P^2) gates by using O(P) temporary state-vectors. This is similar to the gradient calculation subroutine dominating the simulation of quantum gradient descent, which has attracted HPC strategies and bespoke simulation algorithms with asymptotic speedups. We here present a novel simulation strategy to precisely calculate the quantum natural gradient in O(P^2) gates and O(1) state-vectors. While more complicated, our strategy is in the same spirit as that presented for gradients in Reference 6, and involves iteratively evaluating recurrent forms of the Fisher information matrix. Our strategy uses only "apply gate", "clone state" and "inner product" operations which are present in practically all quantum computing simulators. It is furthermore compatible with parallelisation schemes, like hardware acceleration and distribution. Since our scheme leverages a form of the Fisher information matrix for strictly unitary ansatz circuits, it cannot be simply extended to density matrix simulation of quantum natural gradient with non-unitary circuits.
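The baseline the abstract mentions, O(P^2) gate evaluations with O(P) temporary state-vectors, can be sketched for a toy single-qubit ansatz. This is a hedged illustration only, not the paper's O(P^2)-gate, O(1)-state-vector algorithm: the gate set, generators, and function names below are assumptions for the example.

```python
import numpy as np

# Toy ansatz: P single-qubit rotations R(t) = exp(-i t G / 2) acting on |0>,
# with generators G drawn from {Z, X}. Illustrative only.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def rot(G, t):
    """Rotation gate exp(-i t G / 2) for an involutory generator G."""
    return np.cos(t / 2) * I2 - 1j * np.sin(t / 2) * G

def fisher(params, gens):
    """Fisher information matrix
        F_ij = Re( <d_i psi | d_j psi> - <d_i psi | psi><psi | d_j psi> ),
    computed the straightforward way: each derivative state |d_k psi> is
    obtained by re-simulating the circuit with the gate derivative
    (-i G_k / 2) inserted after gate k. That costs O(P) gates per
    derivative, O(P^2) gates in total, and stores O(P) state-vectors."""
    P = len(params)

    # Plain forward simulation for |psi>.
    psi = np.array([1, 0], dtype=complex)
    for j in range(P):
        psi = rot(gens[j], params[j]) @ psi

    # One re-simulation per parameter for |d_k psi>.
    derivs = []
    for k in range(P):
        v = np.array([1, 0], dtype=complex)
        for j in range(P):
            v = rot(gens[j], params[j]) @ v
            if j == k:  # d/dt exp(-i t G/2) = (-i G/2) exp(-i t G/2)
                v = -0.5j * (gens[k] @ v)
        derivs.append(v)

    # Assemble F from inner products (np.vdot conjugates its first argument).
    F = np.empty((P, P))
    for i in range(P):
        for j in range(P):
            F[i, j] = np.real(np.vdot(derivs[i], derivs[j])
                              - np.vdot(derivs[i], psi) * np.vdot(psi, derivs[j]))
    return F
```

For example, `fisher([0.3, 0.7], [Z, X])` describes an Rx(0.7)·Rz(0.3)|0> circuit: the Rz gate only imparts a global phase on |0>, so its diagonal entry vanishes, while the Rx entry equals 1/4. The paper's contribution is to reach the same matrix with O(1) state-vectors by iterating recurrent forms of F, rather than storing all P derivative states as done here.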



Authors (1)
