Fusion-Based Quantum Computing

Updated 28 July 2025
  • Fusion-based quantum computing is a modular paradigm that uses entangling measurements (fusions) on constant-size resource states to construct universal quantum operations.
  • It integrates advanced error correction, adaptive protocols, and deterministic fusion flows to overcome the inherent non-determinism of photonic measurements.
  • The approach enables scalable, fault-tolerant architectures with efficient resource state engineering, providing high success probabilities crucial for next-generation quantum circuits.

Fusion-based quantum computing (FBQC) is a model of universal quantum computation in which computation proceeds by performing entangling measurements—referred to as "fusions"—on qubits drawn from small, constant-sized entangled resource states. The model is most developed for photonic and modular quantum hardware, where it provides a scalable and fault-tolerant approach distinct from both the circuit model and traditional measurement-based quantum computation (MBQC). FBQC architectures are founded on a network of resource states connected by fusion operations, utilizing advanced error correction, adaptive protocols, and modular hardware. This entry surveys the core theoretical frameworks, resource engineering, fusion procedures, hardware and network architectures, and leading schemes for fault-tolerant and scalable implementation.

1. Core Theoretical Principles and Formulation

FBQC generalizes the MBQC paradigm, replacing the need for a monolithic cluster state with a modular network constructed dynamically via fusions. A "fusion" is a two- or multi-qubit measurement (most commonly a Bell state measurement, BSM) that entangles, merges, or projects qubits from distinct resource states. The stabilizer formalism is central: each resource state is specified by a stabilizer group $\mathcal{R}$; the fusion measurements generate a "fusion group" $\mathcal{F}$; and after fusion, the effective code is determined by the surviving stabilizers $S = \{g \in \mathcal{R} : [g, f] = 0 \ \forall f \in \mathcal{F}\}$ and the set of check operators $C = \mathcal{R} \cap \mathcal{F}$ (Bartolucci et al., 2021).
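
As a concrete illustration of this stabilizer bookkeeping, the sketch below represents group elements as Pauli strings and computes the survivors and checks by commutation; the string representation and the four-qubit Bell-pair example are illustrative choices made for this entry, not constructions from the cited paper.

```python
# Sketch of the FBQC stabilizer bookkeeping described above: given a
# resource-state stabilizer group R and a fusion group F (listed here
# as Pauli strings), the surviving stabilizers are the elements of R
# that commute with every fusion operator, and the checks are C = R ∩ F.

def commutes(p: str, q: str) -> bool:
    """Pauli strings commute iff they differ on an even number of
    positions where both are non-identity."""
    anti = sum(1 for a, b in zip(p, q)
               if a != 'I' and b != 'I' and a != b)
    return anti % 2 == 0

def surviving_stabilizers(R, F):
    """S = {g in R : [g, f] = 0 for all f in F}."""
    return [g for g in R if all(commutes(g, f) for f in F)]

def check_operators(R, F):
    """C = R ∩ F."""
    return [g for g in R if g in F]

# Toy example: two Bell pairs (qubits 0-1 and 2-3), with a Bell-type
# fusion measuring qubits 1 and 2. R lists a few group elements
# (including products), not just generators.
R = ["XXII", "ZZII", "IIXX", "IIZZ", "XXXX", "ZZZZ"]
F = ["IXXI", "IZZI"]

print(surviving_stabilizers(R, F))  # ['XXXX', 'ZZZZ']
print(check_operators(R, F))        # [] for this toy network
```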

The mathematical structure of fusion gates is closely related to the ZX calculus: key fusion operations (merge, split, rotate) correspond to ZX generators, allowing ZX diagrams to be interpreted as sequences of physical fusion operations (Beaudrap et al., 2019, Felice et al., 20 Sep 2024). The PF ("Pauli Fusion") model, an abstract but operational formalism, encapsulates these ideas with primitive completely positive trace-preserving (CPTP) maps whose Kraus operators closely match ZX calculus generators (for example, $K_{V,0} = |+\rangle\langle ++| + |-\rangle\langle --|$ and $K_{V,1} = |+\rangle\langle +-| + |-\rangle\langle -+|$).
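
The quoted Kraus operators can be checked directly in a few lines; the numpy sketch below (written for this entry) builds $K_{V,0}$ and $K_{V,1}$ as 2×4 matrices taking two qubits to one and verifies the CPTP completeness relation $\sum_i K_i^\dagger K_i = I$.

```python
import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2)
minus = np.array([1.0, -1.0]) / np.sqrt(2)

def ket_bra(ket, bra):
    """Outer product |ket><bra| for real or complex vectors."""
    return np.outer(ket, bra.conj())

# Two-qubit product states in the X (Hadamard) basis.
pp, mm = np.kron(plus, plus), np.kron(minus, minus)
pm, mp = np.kron(plus, minus), np.kron(minus, plus)

K0 = ket_bra(plus, pp) + ket_bra(minus, mm)   # K_{V,0}: herald bit 0
K1 = ket_bra(plus, pm) + ket_bra(minus, mp)   # K_{V,1}: herald bit 1

# Completeness: K0†K0 + K1†K1 must equal the identity on two qubits.
print(np.allclose(K0.conj().T @ K0 + K1.conj().T @ K1, np.eye(4)))  # True
```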

A crucial aspect is non-determinism: most linear-optical fusion gates are probabilistic, with heralded (measured) outcomes. Deterministic quantum computation is recovered by ensuring the fusion network admits a PF-flow or generalized flow: a graphical structure (formalized as a partial order with corrector sets) guaranteeing that any random sequence of measurement outcomes can be corrected via classical feedforward and Pauli frame updates, so that the intended quantum transformation is always realized (Beaudrap et al., 2019, Felice et al., 20 Sep 2024).
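
The classical half of such a flow is pure bookkeeping, which the toy sketch below illustrates (the class, its data layout, and the corrector-set encoding are invented for this entry, not taken from the cited formalisms): heralded fusion outcomes never trigger physical gates, they only update a per-qubit Pauli frame that is folded into the interpretation of later measurements.

```python
class PauliFrame:
    """Track pending X/Z byproducts per qubit; apply them classically."""

    def __init__(self, n_qubits: int):
        self.x = [0] * n_qubits   # pending X corrections
        self.z = [0] * n_qubits   # pending Z corrections

    def record_fusion(self, outcome_bits, correctors):
        """Fold a heralded fusion outcome into the frame. Which qubits
        absorb which byproduct is dictated by the flow's corrector sets."""
        for bit, (qubit, pauli) in zip(outcome_bits, correctors):
            if bit:
                target = self.x if pauli == 'X' else self.z
                target[qubit] ^= 1

    def adjust(self, qubit: int, raw_result: int, basis: str) -> int:
        """Reinterpret a raw measurement: a pending Z flips an X-basis
        result, and a pending X flips a Z-basis result."""
        flip = self.z[qubit] if basis == 'X' else self.x[qubit]
        return raw_result ^ flip

frame = PauliFrame(4)
frame.record_fusion([1, 0], [(2, 'Z'), (3, 'X')])  # random herald pattern
print(frame.adjust(2, raw_result=0, basis='X'))    # 1: byproduct absorbed
```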

2. Fusion Gates, Measurement Protocols, and Success Probabilities

FBQC relies on entangling measurements, predominantly Type-I and Type-II fusions. In photonic settings, these are often implemented with linear optics: photons from two resource states are interfered on a beamsplitter and subjected to a joint photon-number measurement that, depending on the observed detection pattern, projects the input qubits into a Bell state or a mixture of target and "failure" states (Meng et al., 2023, Thomas et al., 18 Mar 2024).

The prototypical linear-optical BSM has a 50% ideal success probability for dual-rail encoding (Melkozerov et al., 21 Mar 2024, Rimock et al., 21 Jun 2024, Üstün et al., 22 May 2025). Boosted fusion gates, which use ancillary photons or more sophisticated detection, can lift this above 70%, but at increased resource overhead and loss susceptibility. Generalizations of Type-II fusion to arbitrary qudit dimension $d$ have success probability bounded by $2/d^2$ (even $d$) or $2/(d(d+1))$ (odd $d$); extra-dimensional correction and ancillary qudit states (e.g., via time-bin multiplexing and spin-cavity systems) can further improve the practical success probability for $d = 3, 5$ and above (Üstün et al., 22 May 2025).
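
A quick tabulation of the quoted bounds (a sketch of the stated formulas only; boosting and ancilla-assisted improvements are not modelled):

```python
def fusion_success_bound(d: int) -> float:
    """Type-II fusion success bound for qudit dimension d, as quoted:
    2/d^2 for even d, 2/(d(d+1)) for odd d."""
    return 2 / d**2 if d % 2 == 0 else 2 / (d * (d + 1))

for d in (2, 3, 4, 5):
    print(f"d={d}: p_success <= {fusion_success_bound(d):.3f}")
# d=2 recovers the familiar 50% linear-optical Bell-measurement limit.
```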

Encoded fusion schemes embed each logical qubit into blocks of $m$-qubit GHZ or parity codewords, allowing fusion of logical (encoded) qubits to proceed with much higher success probability and loss tolerance, e.g. $P_s = 1 - 2^{-mn}$ for $n$ blocks of $m$ qubits (Song et al., 2 Aug 2024). Multi-block arrangements support error correction on the fly during fusion, as each block acts as an independent check of the fusion outcome.
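
The stated success probability is easy to evaluate; the snippet below (illustrative, with $m$ qubits per block and $n$ blocks as above) shows how quickly encoded fusion approaches determinism:

```python
def encoded_fusion_success(m: int, n: int) -> float:
    """P_s = 1 - 2^(-mn), the ideal (lossless) encoded-fusion success."""
    return 1.0 - 2.0 ** (-m * n)

for m, n in [(1, 1), (2, 2), (4, 4)]:
    print(f"m={m}, n={n}: P_s = {encoded_fusion_success(m, n):.6f}")
# (1, 1) recovers the bare 50% fusion; (4, 4) already exceeds 0.99998.
```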

Repeat-until-success (RUS) protocols exploit quantum emitter sources and may attain near-unit fusion success with low average attempt numbers, at the cost of increased timing or resource requirements (Felice et al., 20 Sep 2024).
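
The trade-off is standard geometric-distribution bookkeeping (not a specific protocol): with per-attempt success probability $p$, the mean number of attempts is $1/p$ and $k$ attempts succeed with probability $1 - (1-p)^k$.

```python
def rus_stats(p: float, k: int):
    """Mean attempt count and overall success probability after k tries."""
    return 1 / p, 1 - (1 - p) ** k

mean_attempts, p_total = rus_stats(p=0.5, k=5)
print(mean_attempts, p_total)  # 2.0 attempts on average; 0.96875 within 5
```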

3. Resource State Engineering and Fault-Tolerant Network Structures

FBQC builds large-scale logical entanglement by "stitching" together small, constant-size resource states—typically graph states (e.g., 4-star, 6-ring, caterpillar, loopy diamond), GHZ states, or error-corrected codewords (Bartolucci et al., 2021, Wein et al., 11 Dec 2024, Song et al., 2 Aug 2024). Deterministically generated resource states, such as those created by quantum dots, atom-cavity systems, or optomechanical setups, offer modular and scalable construction (Meng et al., 2023, Thomas et al., 18 Mar 2024, Pavlovich et al., 1 May 2025, Wein et al., 11 Dec 2024, Chan et al., 22 Jul 2025).

Local encoding, such as the generalized Shor code $\{n, m\}$, enables concatenated protection against both loss and Pauli error. Blocklet-concatenation protocols introduce a hierarchy of code blocks, with each "blocklet" a constant-sized resource state encoding a stabilizer code; by chaining and fusing these, the logical code distance scales favorably, and erasure thresholds of up to 19.1% have been demonstrated for 10-qubit blocklets (Litinski, 16 Jun 2025).

Topological codes are naturally embedded in the fusion network; for example, the three-dimensional Raussendorf-Harrington-Goyal (RHG) lattice is constructed by fusing 4-star or 6-ring resource states according to a pattern that forms logical code faces and volumes, with high error and loss thresholds (Song et al., 2 Aug 2024). Adaptive and exposure-based strategies optimize the selection of fusion measurement bases to minimize the effect of loss clusters, with calculated loss-per-photon thresholds (LPPT) of up to 17% for large local encodings (Bartolucci et al., 13 Jun 2025).

4. Physical Implementations: Hardware, Switching, and Sources

Photonic FBQC implementations leverage modular hardware: single-photon sources (quantum dots, optomechanical sources, spin-embedded photon emitters), linear-optical circuits, and photon detection networks (Bartolucci et al., 2021, Meng et al., 2023, Melkozerov et al., 21 Mar 2024, Wein et al., 11 Dec 2024, Chan et al., 22 Jul 2025).

Time-bin qubit encoding is used extensively for both robustness and source reconfigurability, especially in quantum dot architectures where spin manipulation and optical $\pi$-rotations produce high-fidelity, sequential photonic cluster states (Chan et al., 22 Jul 2025, Meng et al., 2023). Switch networks and optical multiplexing are required both for resource state generation and for routing photons to fusion gates; switch depth and associated loss are crucial, as they limit the achievable error thresholds (Bartolucci et al., 2021). Target component metrics include state preparation/measurement fidelities of 99.98%, Hong-Ou-Mandel (HOM) interference visibility of 99.5%, and chip-to-chip interconnect fidelities exceeding 99.7% in silicon photonics (Alexander et al., 26 Apr 2024).

Optomechanical devices offer a novel solution: acoustic (phonon) modes "cache" probabilistically prepared quantum states, which are then transferred to optical modes via controllable beamsplitter interactions, effectively decoupling heralded (state preparation) and deterministic (readout) stages (Pavlovich et al., 1 May 2025).

5. Error Models, Fault-Tolerance, and Resource-Efficiency

Key error channels in FBQC include photon loss, imperfect fusion (fusion erasure and basis flip), photon distinguishability, spin errors (for emitter systems), and dephasing. The error correction infrastructure operates at two levels: (1) local fusion-level encoding (Shor, parity, graph codes), and (2) global network-level code (surface code, RHG lattice, blocklet product codes).

Thresholds are quantifiable: unboosted, unencoded schemes exhibit LPPTs below 1%; boosted and locally encoded ($\{2,2\}$ Shor code) schemes reach 2.7%–7.5%; exposure-based and adaptive schemes using $\{7,4\}$ encoding attain LPPTs up to 17.4% (Bartolucci et al., 13 Jun 2025). Encoded fusion approaches sharply increase loss tolerance (14% for moderate encoding) with fewer photons than boosting (Song et al., 2 Aug 2024). Practical hardware requirements are reduced drastically by employing deterministic, spin-embedded sources, e.g., achieving near-deterministic 6-ring resource state generation with only 12 such devices (Wein et al., 11 Dec 2024).

Critical performance metrics and formulas include:

| Scheme | LPPT (%) | Resource State | Source Requirement |
|---|---|---|---|
| Unencoded baseline | 0.79 | 6-ring (no boost) | >2000 single-photon sources |
| {2,2}-encoded, exposure-based | 7.5 | 6-ring, exposure-based adaptive | 12 spin-embedded sources |
| Blocklet concatenation | 11.5–19.1 | 8-, 10-, 12-qubit blocklets (concatenated codes) | Constant per blocklet |

Erasure probabilities for encoded fusion are analyzed as

$$p_0 = 1 - (1 - p_\mathrm{loss}) \left( 1 - \frac{1 - p_\mathrm{succ}}{2} \right)$$

for a single physical fusion, and

$$p_\mathrm{enc} = \frac{\left(1 - (1 - p_0)^2\right)^2 + 1 - \left(1 - p_0^2\right)^2}{2}$$

for the encoded fusion network (Melkozerov et al., 21 Mar 2024).
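
These formulas are straightforward to evaluate; the sketch below plugs in illustrative numbers (the parameter values are arbitrary, chosen only to show that the encoding suppresses the erasure rate):

```python
def p0(p_loss: float, p_succ: float) -> float:
    """Single-fusion erasure probability from the first formula above."""
    return 1 - (1 - p_loss) * (1 - (1 - p_succ) / 2)

def p_enc(e: float) -> float:
    """Encoded-network erasure probability from the second formula."""
    return ((1 - (1 - e) ** 2) ** 2 + 1 - (1 - e ** 2) ** 2) / 2

e0 = p0(p_loss=0.01, p_succ=0.5)
print(f"{e0:.4f} -> {p_enc(e0):.4f}")  # 0.2575 -> 0.1648: erasure reduced
```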

6. Network Compilation, Universality, and Future Directions

FBQC frameworks are increasingly marrying formal verification methods (ZX calculus (Beaudrap et al., 2019), graph flows, dataflow programming (Felice et al., 20 Sep 2024)) with network compilation and optimization. Graphically verified protocols, explicit flow structures (PF-flow, XY-flow), and RUS protocols enable deterministic error correction and universality via modular, temporally-ordered fusion operations.

Universality proofs leverage the combinatorial construction of fusion networks simulating arbitrary quantum circuits and the ZX calculus translation between high-level algorithms and physical operations. The modular, constant-depth network structure paves the way for efficient compiler pipelines targeting photonic hardware (Felice et al., 20 Sep 2024).

Research now centers on scaling deterministic photonic hardware (quantum emitters, optomechanics), integrating switch networks with minimal loss, and advancing adaptive, code-concatenated protocols. With demonstrated error thresholds exceeding those of traditional surface codes and sharply reduced resource footprints, advanced FBQC schemes such as blocklet concatenation (Litinski, 16 Jun 2025), encoded fusion (Song et al., 2 Aug 2024), and exposure-based adaptivity (Bartolucci et al., 13 Jun 2025) offer promising blueprints for realizing scalable, fault-tolerant, and manufacturable quantum computers based on photonic and hybrid architectures.


Fusion-based quantum computing thereby constitutes a flexible and physically realistic paradigm, encompassing both the mathematical formalism and the experimental architectures required for the next generation of quantum hardware. The model robustly integrates concepts from stabilizer theory, the ZX calculus, and quantum error correction in a modular, resource-efficient, and scalable framework, with proven universality and a rapidly improving pathway to photonic realization.