
Biased Gottesman-Kitaev-Preskill repetition code (2212.11397v2)

Published 21 Dec 2022 in quant-ph

Abstract: Continuous-variable quantum computing architectures based upon the Gottesman-Kitaev-Preskill (GKP) encoding have emerged as promising candidates because one can achieve fault tolerance with a probabilistic supply of GKP states and Gaussian operations. Furthermore, by generalising to rectangular-lattice GKP states, a bias can be introduced and exploited through concatenation with qubit codes that show improved performance under biasing. However, these codes (such as the XZZX surface code) still require weight-four stabiliser measurements and have complex decoding requirements to overcome. In this work, we study the code-capacity behaviour of a rectangular-lattice GKP encoding concatenated with a repetition code under an isotropic Gaussian displacement channel. We find a numerical threshold of $\sigma = 0.599$ for the noise's standard deviation, which outperforms the biased GKP planar surface code with a trade-off of increased biasing at the GKP level. This is all achieved with only weight-two stabiliser operators and simple decoding at the qubit level. Furthermore, with moderate levels of bias (aspect ratio $\leq 2.4$) and nine or fewer data modes, significant reductions in logical error rates can still be achieved for $\sigma \leq 0.3$, opening the possibility of using GKP-biased repetition codes as a simple low-level qubit encoding for further concatenation.
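The setup described in the abstract can be illustrated with a simplified code-capacity Monte Carlo sketch. The snippet below is an assumption-laden toy model, not the paper's decoder: it treats each quadrature of an ideal rectangular-lattice GKP qubit as decoded to the nearest lattice translation (a flip when that translation is an odd multiple of the spacing), stretches one quadrature by the square root of the aspect ratio to introduce bias, and then applies a distance-$n$ repetition code as a plain majority vote on the dominant flip type, ignoring the analog syndrome information a full GKP decoder would exploit. The spacing convention ($\sqrt{\pi}$ for a square lattice) and the specific bias split are assumptions for illustration.

```python
import numpy as np
from math import comb

def gkp_flip_prob_mc(sigma, spacing, n_samples=200_000, seed=0):
    """Monte Carlo estimate of the logical-flip probability for one
    quadrature of an ideal GKP qubit under Gaussian shift noise.

    A shift u is decoded to the nearest integer multiple of `spacing`;
    an odd multiple corresponds to a logical flip (toy model).
    """
    rng = np.random.default_rng(seed)
    u = rng.normal(0.0, sigma, n_samples)
    k = np.rint(u / spacing)           # nearest lattice translation
    return float(np.mean(k % 2 != 0))  # odd multiple -> logical error

def majority_fail(p, n):
    """Failure probability of a distance-n repetition code under i.i.d.
    flip probability p, decoded by simple majority vote (code capacity)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range((n + 1) // 2, n + 1))

# Rectangular (biased) GKP: stretch one quadrature by sqrt(aspect_ratio)
# and shrink the other, so one flip type is suppressed at the GKP level
# while the other becomes dominant.
sigma, aspect, n_modes = 0.3, 2.4, 9     # parameters quoted in the abstract
d = np.sqrt(np.pi)                       # square-lattice spacing (assumed)
p_protected = gkp_flip_prob_mc(sigma, d * np.sqrt(aspect))
p_dominant  = gkp_flip_prob_mc(sigma, d / np.sqrt(aspect))

# The weight-two-stabiliser repetition code then corrects the dominant
# flip type across the nine data modes:
p_logical = majority_fail(p_dominant, n_modes)
print(f"protected flip: {p_protected:.2e}, dominant flip: {p_dominant:.2e}, "
      f"after repetition code: {p_logical:.2e}")
```

Even this crude vote-based decoder shows the qualitative effect claimed in the abstract: at moderate bias (aspect ratio 2.4) and $\sigma = 0.3$, the dominant flip rate is suppressed by the repetition code well below its bare value, while the other flip type is already rare at the GKP level.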

Citations (11)
