Partially Concatenated Calderbank-Shor-Steane Codes Achieving the Quantum Gilbert-Varshamov Bound Asymptotically (2107.05174v2)

Published 12 Jul 2021 in quant-ph, cs.IT, and math.IT

Abstract: In this paper, we utilize a concatenation scheme to construct new families of quantum error correction codes achieving the quantum Gilbert-Varshamov (GV) bound asymptotically. We concatenate alternant codes with any linear code achieving the classical GV bound to construct Calderbank-Shor-Steane (CSS) codes. We show that the concatenated code can achieve the quantum GV bound asymptotically and can approach the Hashing bound for asymmetric Pauli channels. By combining Steane's enlargement construction of CSS codes, we derive a family of enlarged stabilizer codes achieving the quantum GV bound for enlarged CSS codes asymptotically. As applications, we derive two families of fast encodable and decodable CSS codes with parameters $\mathscr{Q}_1=[[N,\Omega(\sqrt{N}),\Omega( \sqrt{N})]],$ and $\mathscr{Q}_2=[[N,\Omega(N/\log N),\Omega(N/\log N)/\Omega(\log N)]].$ We show that $\mathscr{Q}_1$ can be encoded very efficiently by circuits of size $O(N)$ and depth $O(\sqrt{N})$. For an input error syndrome, $\mathscr{Q}_1$ can correct any adversarial error of weight up to half the minimum distance bound in $O(N)$ time. $\mathscr{Q}_1$ can also be decoded in parallel in $O(\sqrt{N})$ time by using $O(\sqrt{N})$ classical processors. For an input error syndrome, we prove that $\mathscr{Q}_2$ can correct a linear number of ${X}$-errors with high probability and an almost linear number of ${Z}$-errors in $O(N)$ time. Moreover, $\mathscr{Q}_2$ can be decoded in parallel in $O(\log(N))$ time by using $O(N)$ classical processors.
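The CSS construction at the heart of the abstract builds a quantum code from a pair of classical linear codes $C_2 \subseteq C_1$, giving an $[[n, k_1 - k_2]]$ quantum code. As a minimal sketch (not taken from the paper, which uses concatenated alternant codes), the snippet below checks the dual-containment condition $C^\perp \subseteq C$ for the classical $[7,4,3]$ Hamming code and derives the resulting $[[7,1]]$ Steane code parameters; the matrix `H` and the helper `gf2_rank` are illustrative choices, not the authors' construction.

```python
import numpy as np

def gf2_rank(M):
    """Rank of a binary matrix over GF(2) via Gaussian elimination."""
    M = M.copy() % 2
    rank = 0
    rows, cols = M.shape
    for c in range(cols):
        pivot = next((r for r in range(rank, rows) if M[r, c]), None)
        if pivot is None:
            continue
        M[[rank, pivot]] = M[[pivot, rank]]  # swap pivot row into place
        for r in range(rows):
            if r != rank and M[r, c]:
                M[r] ^= M[rank]  # eliminate column c from other rows
        rank += 1
    return rank

# Parity-check matrix of the classical [7,4,3] Hamming code C.
# Its rows generate the dual code C^perp, so C^perp ⊆ C (the CSS
# condition with C1 = C, C2 = C^perp) holds iff H @ H.T = 0 over GF(2).
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

assert not ((H @ H.T) % 2).any(), "dual-containment (CSS condition) fails"

# CSS(C, C^perp) has parameters [[n, k1 - k2]] = [[n, n - 2*rank(H)]]:
# here the [[7, 1, 3]] Steane code.
n = H.shape[1]
k = n - 2 * gf2_rank(H)
print(f"[[{n}, {k}]] CSS code")
```

The paper's point is that choosing the two classical codes via concatenation (alternant outer codes with GV-bound-achieving inner codes) lets this same construction reach the quantum GV bound asymptotically, which toy examples like the one above do not.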

Authors (6)
  1. Jihao Fan (10 papers)
  2. Jun Li (778 papers)
  3. Ya Wang (97 papers)
  4. Yonghui Li (241 papers)
  5. Min-Hsiu Hsieh (148 papers)
  6. Jiangfeng Du (185 papers)
Citations (2)
