
Quantum time dynamics mediated by the Yang-Baxter equation and artificial neural networks (2401.17116v2)

Published 30 Jan 2024 in quant-ph, cond-mat.soft, cs.LG, and physics.comp-ph

Abstract: Quantum computing shows great potential, but errors pose a significant challenge. This study explores new strategies for mitigating quantum errors using artificial neural networks (ANN) and the Yang-Baxter equation (YBE). Unlike traditional error mitigation methods, which are computationally intensive, we investigate artificial error mitigation. We developed a novel method that combines an ANN for noise mitigation with the YBE to generate noisy data. This approach effectively reduces noise in quantum simulations, enhancing the accuracy of the results. The YBE rigorously preserves quantum correlations and symmetries in spin-chain simulations of certain classes of integrable lattice models, enabling effective compression of quantum circuits while retaining linear scalability with the number of qubits. This compression facilitates both full and partial implementations, allowing the generation of noisy quantum data on hardware alongside noiseless simulations on classical platforms. By introducing controlled noise through the YBE, we enhance the dataset for error mitigation. We train an ANN model on partial data from quantum simulations and demonstrate its effectiveness in mitigating errors in time-evolving quantum states, providing a scalable framework for improving quantum computation fidelity, particularly on noisy intermediate-scale quantum (NISQ) systems. We demonstrate the efficacy of this approach by performing quantum time dynamics simulations using the Heisenberg XY Hamiltonian on real quantum devices.
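The abstract describes the method only at a high level; the following is a minimal, self-contained sketch of the ANN error-mitigation step. It is not the authors' implementation: the YBE-compressed circuits executed on hardware are replaced here by a toy damping-plus-shot-noise model of a two-site XY chain, and the network architecture (a small scikit-learn MLP), the (time, noisy value) features, and the alternating train/test split are illustrative assumptions rather than choices taken from the paper.

```python
# Sketch (not the authors' code): train an ANN to map noisy time-evolved
# expectation values to their noiseless counterparts. Hardware noise is
# emulated with a simple damping + shot-noise model instead of YBE-compressed
# circuits run on a real device.
import numpy as np
from scipy.linalg import expm
from sklearn.neural_network import MLPRegressor

# Two-site XY Hamiltonian: H = J (X1 X2 + Y1 Y2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)
J = 1.0
H = J * (np.kron(X, X) + np.kron(Y, Y))

def exact_magnetization(t, psi0):
    """Noiseless <Z1>(t) under exp(-iHt), via the matrix exponential."""
    psi_t = expm(-1j * H * t) @ psi0
    Z1 = np.kron(Z, I2)
    return np.real(psi_t.conj() @ Z1 @ psi_t)

# Initial product state |10> (first spin excited)
psi0 = np.zeros(4, dtype=complex)
psi0[2] = 1.0

times = np.linspace(0.0, 4.0, 200)
ideal = np.array([exact_magnetization(t, psi0) for t in times])

# Stand-in for device noise: exponential signal damping plus shot noise
rng = np.random.default_rng(0)
noisy = ideal * np.exp(-0.35 * times) + rng.normal(0.0, 0.02, times.shape)

# Features: (time, noisy value); target: noiseless value.
# Alternating points stand in for the paper's "partial data" training setup.
X_feat = np.column_stack([times, noisy])
train_idx = np.arange(0, len(times), 2)
test_idx = np.arange(1, len(times), 2)

ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
ann.fit(X_feat[train_idx], ideal[train_idx])
mitigated = ann.predict(X_feat[test_idx])

print("RMSE noisy    :", np.sqrt(np.mean((noisy[test_idx] - ideal[test_idx]) ** 2)))
print("RMSE mitigated:", np.sqrt(np.mean((mitigated - ideal[test_idx]) ** 2)))
```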
