Correcting a noisy quantum computer using a quantum computer

Published 10 Jun 2025 in quant-ph, cond-mat.dis-nn, and cond-mat.stat-mech (arXiv:2506.08331v1)

Abstract: Quantum computers require error correction to achieve universal quantum computing. However, current decoding of quantum error-correcting codes relies on classical computation, which is slower than quantum operations in superconducting qubits. This discrepancy makes the practical implementation of real-time quantum error correction challenging. In this work, we propose a decoding scheme that leverages the operations of the quantum circuit itself. Given a noisy quantum circuit $A$, we train a decoding quantum circuit $B$ using syndrome measurements to identify the logical operators needed to correct errors in circuit $A$. The trained quantum circuit $B$ can be deployed on quantum devices, such as superconducting qubits, to perform real-time decoding and error correction. Our approach is applicable to general quantum codes with multiple logical qubits, operates efficiently under various noise conditions, and decodes at the same speed as the quantum circuits being corrected. We have conducted numerical experiments using surface codes up to distance 7 under circuit-level noise, demonstrating performance on par with the classical minimum-weight perfect matching algorithm. Interestingly, our method reveals that the traditionally classical task of decoding error-correcting codes can be accomplished without classical devices or measurements. This insight paves the way for the development of self-correcting quantum computers.
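
The abstract benchmarks the trained quantum decoder against classical minimum-weight perfect matching (MWPM) on distance-7 surface codes under circuit-level noise. For readers who want a concrete reference point, the snippet below is a minimal sketch of that classical MWPM baseline only, assuming the open-source stim and pymatching libraries; the distance, number of rounds, and 0.5% depolarizing rate are illustrative choices, not values taken from the paper, and the paper's own quantum decoding circuit $B$ is not reproduced here.

```python
# Hedged sketch: classical MWPM baseline for a rotated surface code memory
# experiment under circuit-level depolarizing noise. Parameters (distance,
# rounds, error rate) are illustrative assumptions, not the paper's settings.
import numpy as np
import stim
import pymatching

# Generate a distance-7 surface-code memory circuit with circuit-level noise.
circuit = stim.Circuit.generated(
    "surface_code:rotated_memory_z",
    distance=7,
    rounds=7,
    after_clifford_depolarization=0.005,
)

# Build the matching decoder from the circuit's detector error model.
dem = circuit.detector_error_model(decompose_errors=True)
matcher = pymatching.Matching.from_detector_error_model(dem)

# Sample detector outcomes (syndromes) and the true logical observable flips.
sampler = circuit.compile_detector_sampler()
syndromes, observables = sampler.sample(100_000, separate_observables=True)

# Decode each shot and estimate the logical error rate.
predictions = matcher.decode_batch(syndromes)
logical_error_rate = np.mean(np.any(predictions != observables, axis=1))
print(f"MWPM logical error rate: {logical_error_rate:.5f}")
```

In the scheme the abstract describes, this classical matching step is what the trained quantum circuit $B$ would replace: the syndrome information is processed on the quantum device itself, so correction proceeds at the speed of the circuit being protected rather than waiting on classical decoding.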
