Preparing Code States via Seed-Entangler-Enriched Sequential Quantum Circuits: Application to Tetra-Digit Topological Error-Correcting Codes (2503.05374v2)
Abstract: We introduce a unified and efficient quantum circuit framework, termed the \emph{Seed-Entangler-Enriched Sequential Quantum Circuit} (SEESQC), to prepare quantum states in the code space of topological error-correcting codes. Specifically, we apply SEESQC to prepare code states of Tetra-Digit models -- a broad class of long-range entangled stabilizer codes indexed by a four-digit parameter. Far from rare, these models encompass Toric Codes across arbitrary dimensions and subsume the X-cube fracton code as special cases. Featuring a hierarchical structure of generalized entanglement renormalization group, many Tetra-Digit models host spatially extended excitations (e.g., loops, membranes, and exotic non-manifold objects) with constrained mobility and deformability, and exhibit system-size-dependent ground state degeneracies that scale exponentially with a polynomial in the linear sizes. In this work, we first demonstrate, graphically and algebraically, quantum circuits for computational basis states, before generalizing to broader cases. Central to this framework is a key ingredient termed the \emph{seed-entangler}, acting on a small number of qubits termed \textit{seeds}, which enables a systematic scheme to achieve arbitrary code states. Remarkably, the number of available seeds equals the number of logical qubits for the constructed examples, which leaves plenty of room for future investigation in theoretical physics, mathematics, and quantum information science. With experimental feasibility via synthetic dimensions in modern quantum simulators, this framework transcends spatial dimensions, bridges liquid and non-liquid states, unifies gapped phases governed by distinct entanglement renormalization group schemes, and offers a pathway toward engineering topological phases and manipulating logical qubits.
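As a toy illustration of the general idea (not the paper's SEESQC construction), the following sketch prepares a stabilizer-code state with a sequential circuit: a Hadamard on one qubit followed by a chain of CNOTs yields the GHZ state, i.e., the logical $|\bar{+}\rangle$ state of the $n$-qubit repetition code. All function names here are hypothetical helpers for a NumPy statevector simulation.

```python
import numpy as np

def apply_1q(state, gate, q):
    """Apply a 2x2 gate to qubit q of a statevector shaped (2,)*n."""
    state = np.moveaxis(state, q, 0)
    state = np.tensordot(gate, state, axes=([1], [0]))
    return np.moveaxis(state, 0, q)

def apply_cnot(state, ctrl, tgt):
    """Apply CNOT: flip the target axis on the control=1 slice."""
    state = np.moveaxis(state, (ctrl, tgt), (0, 1))
    new = state.copy()
    new[1] = state[1][::-1]          # X on target when control is |1>
    return np.moveaxis(new, (0, 1), (ctrl, tgt))

n = 4
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

state = np.zeros((2,) * n)
state[(0,) * n] = 1.0                # start from |00...0>
state = apply_1q(state, H, 0)        # seed qubit into superposition
for q in range(n - 1):               # sequential entangling layer
    state = apply_cnot(state, q, q + 1)

# state is now (|00...0> + |11...1>)/sqrt(2), a +1 eigenstate
# of every repetition-code stabilizer Z_i Z_{i+1}
```

The sequential structure, entangling one new qubit per step, is the feature this toy shares with the framework in the abstract; the Tetra-Digit constructions themselves involve far richer circuits and seed-entanglers.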