
Fault-tolerant Post-Selection for Low Overhead Magic State Preparation (2212.00813v1)

Published 1 Dec 2022 in quant-ph

Abstract: We introduce a framework for fault-tolerant post-selection (FTPS) of fault-tolerant codes and channels -- such as those based on surface-codes -- using soft-information metrics based on visible syndrome and erasure information. We introduce several metrics for ranking configurations of syndromes and erasures. In particular, we introduce the \emph{logical gap} (and variants thereof) as a powerful soft-information metric for predicting logical error rates of fault-tolerant channels based on topological error-correcting codes. The logical gap is roughly the unsigned weight difference between inequivalent logical corrections and is adaptable to any tailored noise model or decoder. We deploy this framework to prepare high-quality surface code magic states with low overhead under a model of independent and identically distributed (\emph{i.i.d.}) Pauli and erasure errors. Post-selection strategies based on the logical gap can suppress the encoding error rate of a magic state preparation channel to the level of the physical error rate with low overhead. For example, when operating at $60\%$ the bulk threshold of the corresponding surface code, an overall reduction of the encoding error rate by a factor of $15$ is achievable with a relative overhead factor of ${< 2}$ (approximately $23$ times less than that of simple syndrome-counting rules). We analyze a schematic buffer architecture for implementing post-selection rules on magic state factories in the context of magic state distillation. The FTPS framework can be utilized for mitigating errors in more general fault-tolerant logical channels.
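
To make the logical-gap idea concrete, here is a minimal Python sketch (not taken from the paper) that computes the gap for a toy length-$n$ repetition code under i.i.d. bit-flip noise, where the two inequivalent corrections for any syndrome are bitwise complements of each other. The `accept` threshold rule is a hypothetical stand-in for the paper's post-selection strategies on surface-code magic state preparation.

```python
# Toy sketch of the "logical gap" soft-information metric, illustrated on a
# length-n repetition code with parity checks Z_i Z_{i+1} under i.i.d.
# bit-flip noise (illustrative only; the paper's construction uses surface codes).
from itertools import accumulate
from operator import xor

def corrections(syndrome):
    """Return the two logically inequivalent corrections consistent with
    `syndrome` (a list of 0/1 parity-check outcomes)."""
    # Fix the first bit to 0 and propagate c_{i+1} = c_i XOR s_i.
    c0 = [0] + list(accumulate(syndrome, xor))
    c1 = [1 - b for b in c0]   # complement: differs by the logical operator
    return c0, c1

def logical_gap(syndrome):
    """Unsigned weight difference between the two correction classes."""
    c0, c1 = corrections(syndrome)
    return abs(sum(c0) - sum(c1))

def accept(syndrome, threshold=2):
    """Hypothetical post-selection rule: keep the preparation attempt only
    when the decoder is confident, i.e. the logical gap is large."""
    return logical_gap(syndrome) >= threshold

# Example: distance-5 code, syndrome produced by a single flip on qubit 1.
s = [1, 1, 0, 0]
print(logical_gap(s), accept(s))   # -> 3 True
```

In the surface-code setting described in the abstract, the analogous gap is obtained from decoder outputs in each logical class (and can be adapted to the noise model or decoder), and post-selecting on a large gap trades a modest acceptance overhead for a suppressed encoding error rate.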

Citations (10)
