Scalable Neural Decoder for Topological Surface Codes

Published 18 Jan 2021 in quant-ph and cond-mat.stat-mech (arXiv:2101.07285v2)

Abstract: With the advent of noisy intermediate-scale quantum (NISQ) devices, practical quantum computing has seemingly come into reach. However, to go beyond proof-of-principle calculations, current processing architectures will need to scale up to larger quantum circuits, which in turn will require fast and scalable algorithms for quantum error correction. Here we present a neural network based decoder that, for a family of stabilizer codes subject to depolarizing noise and syndrome measurement errors, is scalable to tens of thousands of qubits (in contrast to other recent machine learning inspired decoders) and exhibits faster decoding times than the state-of-the-art union find decoder for a wide range of error rates (down to 1%). The key innovation is to autodecode error syndromes on small scales by shifting a preprocessing window over the underlying code, akin to a convolutional neural network in pattern recognition approaches. We show that such a preprocessing step allows us to reduce the error rate by up to two orders of magnitude in practical applications and, by detecting correlation effects, shifts the actual error threshold up to fifteen percent higher than the threshold of conventional error correction algorithms such as union find or minimum weight perfect matching, even in the presence of measurement errors. An in-situ implementation of such machine learning assisted quantum error correction will be a decisive step toward pushing the entanglement frontier beyond the NISQ horizon.
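The abstract describes the key idea only at a high level: a local preprocessing window is swept over the code lattice, absorbing short error chains before a global decoder handles the sparser residual syndrome. The paper's actual architecture (a neural network) is not spelled out here, so the following is only a toy illustration of the sweep itself, in plain NumPy rather than a trained network: a hypothetical `local_preprocess` function slides a small window over a 2D syndrome lattice and pairs up defects that fall inside the same window, standing in for the local corrections the neural preprocessor would propose.

```python
import numpy as np

def local_preprocess(syndrome, window=3):
    """Toy stand-in for a sliding-window syndrome preprocessor.

    Sweeps a `window x window` patch across the syndrome lattice and
    cancels defects pairwise within each patch, mimicking how a small
    local decoder can absorb short error chains so that a global
    decoder (e.g. union find) only sees the residual syndrome.
    This is an illustrative sketch, not the paper's trained network.
    """
    s = syndrome.copy()
    rows, cols = s.shape
    for i in range(rows - window + 1):
        for j in range(cols - window + 1):
            patch = s[i:i + window, j:j + window]  # view into s
            defects = np.argwhere(patch == 1)
            # Pair defects two at a time inside the patch; each pair
            # corresponds to a local correction that removes both.
            for a, b in zip(defects[0::2], defects[1::2]):
                patch[tuple(a)] = 0
                patch[tuple(b)] = 0
    return s

syndrome = np.zeros((6, 6), dtype=int)
syndrome[1, 1] = syndrome[1, 2] = 1   # short chain: cancelled locally
syndrome[5, 5] = 1                    # isolated defect: left for the
residual = local_preprocess(syndrome)  # global decoder to handle
```

In this sketch the adjacent defect pair is removed by the sweep, while the lone defect survives into the residual syndrome; the abstract's reported gains (up to two orders of magnitude in effective error rate) come from such local absorption happening at scale.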