
Non-Adaptive Measurement-Based Quantum Computation

Updated 24 March 2026
  • Non-adaptive measurement-based quantum computation (NBQC) is a model in which a fixed, pre-determined set of measurement settings is applied to a highly entangled resource state, so that the entire computation runs in a single parallel round.
  • The approach combines structured resource states with binary-matrix pre-processing that maps classical inputs to measurement bases, enabling a precise analysis of resource hierarchies and complexity trade-offs.
  • By constraining measurements to specific levels of the Clifford hierarchy, NBQC delineates computational boundaries in terms of the Boolean functions that can be evaluated deterministically and the minimal qubit resources required.

Measurement-Based Quantum Computation (MBQC) is a paradigm in which quantum computation is driven by measurements on a highly entangled multi-qubit resource state, typically a graph or cluster state. The measurement primitives are local and, depending on the MBQC variant, may be either adaptive (measurement bases determined by previous outcomes) or non-adaptive (measurement settings fixed in advance). Non-adaptive measurement-based quantum computation (NBQC, sometimes written non-adaptive MBQC) is defined by the absence of adaptivity: all measurement settings are set at the outset, and the measurement process can be conducted in a single parallel round. NBQC is particularly valuable for analyzing the minimal and structural quantum resources required for specific computational tasks, enabling precise delineation of quantum-classical computational boundaries and resource trade-offs (Frembs et al., 2022, Hoban et al., 2013).

1. Formal Structure of Non-Adaptive MBQC

A non-adaptive MBQC computation employs a fixed $N$-qubit entangled resource state $|\psi\rangle$ (frequently a stabilizer or graph state), with $n$ bits of classical input $i \in \mathbb{F}_2^n$ used to select the measurement settings for each local measurement. The classical side-processing can be decomposed into:

  • Pre-processing: An $N \times n$ binary matrix $P$ maps each input $i$ to measurement settings $c = Pi \in \mathbb{F}_2^N$.
  • Measurement: Each qubit $k$ is measured in one of two allowable basis choices $M_k(0), M_k(1)$ (Pauli-type observables), yielding outcome $m_k \in \mathbb{F}_2$.
  • Post-processing: The parity (possibly with an offset) of the measurement outcomes yields the final output $o = \sum_k m_k + m_0 \pmod{2}$.

In the deterministic regime, the distributed measurement defines a global operator $M(c(i))$ such that $M(c(i))|\psi\rangle = (-1)^{o(i)}|\psi\rangle$. Notably, because all settings are fixed by the input and do not depend on previous measurement results, the computation is temporally flat and can be executed in a single round [(Frembs et al., 2022), Def. 1].
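As a concrete illustration, the following minimal simulation (only NumPy is assumed; the function name is illustrative) runs the pre-processing / measurement / parity post-processing pipeline above for the quadratic function $\mathrm{OR}(a,b) = a + b + ab$, computed deterministically with fixed Pauli-$X$/$Y$ measurements on a 3-qubit GHZ state:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

# 3-qubit GHZ resource state (|000> + |111>)/sqrt(2)
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

def nbqc_or(a, b):
    # Pre-processing: input bits fix all three measurement settings at once
    # (this is c = Pi over F_2 for a suitable binary matrix P)
    settings = [a, b, a ^ b]
    ops = [Y if c else X for c in settings]
    M = np.kron(np.kron(ops[0], ops[1]), ops[2])
    # GHZ is a (-1)^o eigenstate of the selected Pauli product, so the
    # outcome parity o is deterministic even though each m_k is random
    eig = np.real(ghz.conj() @ M @ ghz)
    return 0 if round(eig) == 1 else 1  # parity post-processing

for a in (0, 1):
    for b in (0, 1):
        assert nbqc_or(a, b) == (a | b)
```

The settings depend only on the input bits, never on measurement outcomes, so all three measurements can be performed in a single parallel round.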

2. Hierarchies via Clifford Hierarchy Constraints

A central insight of NBQC is that restricting measurement bases to unitary rotations within a given level $D$ of the single-qubit Clifford hierarchy yields a strict correspondence between computational power and resource complexity. Each basis choice $U_k(c_k)$ is constrained to $C_D$, with $C_1$ the Pauli group and $C_{k+1}$ defined recursively by $C_{k+1} = \{U : U P U^\dagger \in C_k\ \forall P \in \mathrm{Paulis}\}$.

  • Level $D=2$ (Clifford): Only stabilizer measurements are allowed. This regime is classically simulable by the Gottesman-Knill theorem.
  • Level $D=3$ (includes certain non-Clifford gates): Adaptive MBQC at $D=3$ suffices for universality, but the non-adaptive variant remains limited.

This axis—the minimum hierarchy depth enabling a given computation—is often termed the "magic" or "non-Cliffordness" resource axis.
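The recursive definition can be tested numerically. The sketch below (function names are illustrative; only NumPy is assumed) checks membership in the single-qubit Clifford hierarchy up to global phase, confirming that the $T$ gate lies in $C_3$ but not in $C_2$:

```python
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = [X, Y, Z]

def is_pauli(U):
    # U is in C_1 (up to global phase) iff |tr(P^dag U)| = 2 for some Pauli P,
    # since 2x2 unitaries have Frobenius norm sqrt(2)
    return any(np.allclose(abs(np.trace(P.conj().T @ U)), 2) for P in [I] + PAULIS)

def in_level(U, k):
    # Recursive membership test: C_1 = Paulis,
    # C_{k+1} = {U : U P U^dag in C_k for all Paulis P}
    if k == 1:
        return is_pauli(U)
    return all(in_level(U @ P @ U.conj().T, k - 1) for P in PAULIS)

T = np.diag([1, np.exp(1j * np.pi / 4)])
print(in_level(T, 2), in_level(T, 3))  # prints: False True
```

For instance, $T X T^\dagger = (X+Y)/\sqrt{2}$ is not a Pauli (so $T \notin C_2$) but is itself a Clifford, which is why $T \in C_3$.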

3. Boolean Function Computability and Resource Hierarchies

The achievable Boolean functions $f: \mathbb{F}_2^n \to \mathbb{F}_2$ are sharply characterized:

  • Stabilizer MBQC / Level-2: Only quadratic polynomials in the input can be computed deterministically:

$$f(x) = \sum_i l_i x_i + \sum_{i<j} q_{ij} x_i x_j \qquad (l_i, q_{ij} \in \mathbb{F}_2).$$

If $f$ is not quadratic, the maximum achievable success probability with stabilizer MBQC is $P_\mathrm{succ} = 1 - \mathrm{NQ}(f)/2^n$, where $\mathrm{NQ}(f)$ is the minimal Hamming distance from $f$ to any quadratic function [(Frembs et al., 2022), Cor. 1].

  • Level-$D$ NBQC: Deterministic computation is possible only for $f$ with $\mathbb{F}_2$-degree at most $D$:

$$f(x) = \sum_{S \subseteq [n],\, |S| \leq D} \alpha_S \prod_{i \in S} x_i \qquad (\alpha_S \in \mathbb{F}_2).$$

Polynomials of degree exceeding $D$ cannot be computed deterministically at that level [(Frembs et al., 2022), Thm. 4].

  • Resource State and Qubit Count: For quadratic functions, the minimal qubit count is related to the rank $r$ of the associated symmetric quadratic form over $\mathbb{F}_2$, specifically $N = r + 1$ [(Frembs et al., 2022), Thm. 5]. More generally, for arbitrary $f$, the Walsh–Fourier expansion determines the minimal GHZ-state size: $R_\mathrm{GHZ}(f)$ equals the minimal number of nonzero Walsh coefficients achievable by adding $\mathbb{F}_2$-zero polynomials [(Frembs et al., 2022), Thm. 8]. As a result, there exist pairs of functions $f, g$ with $\deg f > \deg g$ but $R_\mathrm{GHZ}(f) < R_\mathrm{GHZ}(g)$, so minimal qubit count is not monotonic in degree.

This structure establishes a two-axis "resource hierarchy": the vertical axis is the Clifford hierarchy level, and the horizontal axis is the minimal required qubit count.
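The degree criterion is easy to check for concrete functions. The following sketch (helper names are illustrative) recovers the algebraic normal form of a Boolean function via a Möbius transform over $\mathbb{F}_2$ and reads off its degree, i.e. the minimal hierarchy level $D$ at which deterministic non-adaptive evaluation is possible:

```python
from itertools import product

def anf_coeffs(f, n):
    # Moebius transform over F_2: recover the coefficients alpha_S of the
    # algebraic normal form f(x) = sum_S alpha_S prod_{i in S} x_i
    coeffs = [f(*x) & 1 for x in product((0, 1), repeat=n)]
    for i in range(n):
        step = 1 << i
        for j in range(1 << n):
            if j & step:
                coeffs[j] ^= coeffs[j ^ step]
    return coeffs

def f2_degree(f, n):
    # Degree = largest |S| with alpha_S != 0 (popcount of the index)
    coeffs = anf_coeffs(f, n)
    return max((bin(s).count("1") for s in range(1 << n) if coeffs[s]), default=0)

# Majority-of-3 = ab + bc + ca over F_2: degree 2, computable at level D = 2
maj = lambda a, b, c: (a & b) | (b & c) | (a & c)
print(f2_degree(maj, 3))                          # 2
# AND of 3 bits has degree 3, so it needs level D >= 3
print(f2_degree(lambda a, b, c: a & b & c, 3))    # 3
```

Note that by the non-monotonicity remark above, this degree bounds the required hierarchy level but does not by itself determine the minimal qubit count.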

4. Resource Complexity, Flow, and Physical Constraints

The patterns of measurements and topology of the resource state are subject to further optimization:

  • Flow vs. gFlow: A graph has flow if measurement dependencies can be arranged such that only $n+1$ physical qubits need to be present at any point ($n$ being the number of inputs/outputs). Patterns without flow but with the more general gflow may require up to $m-2$ qubits to be present simultaneously, $m$ being the total pattern size (Houshmand et al., 2017). Clifford/Pauli measurements ($X$/$Y$) can be performed first and absorbed into local graph transformations, further affecting resource requirements.
  • Trade-Offs: For highly constrained physical architectures, MBQC variants supporting flow are much more hardware-efficient, since the computation can be realized on a minimal set of qubits at any one time ("on-the-fly" extension) (Houshmand et al., 2017).

5. Non-Adaptive MBQC, IQP*, and Complexity-Theoretic Separations

NBQC includes families such as IQP* ("Instantaneous Quantum Polynomial-time") circuits—commuting-gate circuits diagonal in the X or Z basis—which are efficiently implemented via non-adaptive MBQC (Hoban et al., 2013). IQP* output distributions are believed to be hard to sample from exactly classically (unless the polynomial hierarchy collapses to its third level), even though the underlying resource states are separable in the measurement basis and cannot violate Bell inequalities. Thus, non-classicality and computational advantage can manifest without entanglement or contextuality in the standard sense [(Hoban et al., 2013), Theorem 1].
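To make the model concrete, the sketch below (phases chosen at random purely for illustration; only NumPy is assumed) simulates sampling from a small IQP-type circuit: a layer of Hadamards, a phase layer diagonal in Z generated by products of $Z$ operators, a second Hadamard layer, and a single non-adaptive round of computational-basis measurements:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(7)
n = 3
N = 1 << n

# Random multiples of pi/4 on single qubits and pairs (illustrative choice)
theta = {S: rng.integers(8) * np.pi / 4
         for r in (1, 2) for S in combinations(range(n), r)}

# Diagonal phase phi(x) = sum_S theta_S * prod_{q in S} x_q
x = np.arange(N)
phi = np.zeros(N)
for S, t in theta.items():
    ind = np.ones(N, dtype=int)
    for q in S:
        ind &= (x >> q) & 1
    phi += t * ind

# |psi> = H^{tensor n} . D . H^{tensor n} |0...0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
Hn = H
for _ in range(n - 1):
    Hn = np.kron(Hn, H)
psi = Hn @ (np.exp(1j * phi) / np.sqrt(N))
probs = np.abs(psi) ** 2                   # output distribution; sums to 1
samples = rng.choice(N, size=5, p=probs)   # non-adaptive sampling runs
```

All gates commute, so no measurement outcome ever needs to influence a later setting: the whole circuit corresponds to one parallel measurement round.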

A tight summary of complexity-class separations in this setting:

| MBQC Subtheory | Determinism | Hierarchy Level | Complexity | Computable $f$ |
|---|---|---|---|---|
| Non-adaptive, Clifford | Yes | $D=2$ | $\subseteq$ P | Quadratic polynomials |
| Non-adaptive, level $D$ | Yes | $D$ | BQP-hard ($D \geq 3$) | Polynomials of degree $\leq D$ |
| IQP* family (non-adaptive) | Sampling | Diagonal gates | PP-hard sampling | Parity-of-product functions |
| Non-adaptive, unbounded $D$ | Yes | Unbounded | Exponential resources | Arbitrary Boolean $f$ |
| Adaptive, $D \geq 3$ | Yes | $D$ | BQP-complete | Universal |

6. Byproducts, Causality, and the Necessity of Adaptivity

A fundamental distinction between NBQC and general (adaptive) MBQC is the management of Pauli byproducts arising from the inherent randomness of quantum measurement. In general MBQC, these byproducts necessitate classical feed-forward: measurement outcomes determine future measurement bases to ensure deterministic computation. Morimae's no-go theorem proves that it is impossible to construct an MBQC resource state that is universally byproduct-free without violating the no-signaling principle; thus, non-adaptive MBQC cannot be universal for quantum computation (in the circuit sense) (Morimae, 2012).

However, for certain circuit families (notably IQP*), all logical byproducts commute with the unitaries present, so adaptivity is not required and the overall operation can be implemented in a single measurement round (Hoban et al., 2013).

7. Connections to Classical and Nonclassical Computing

Classical analogues of MBQC, referred to as Measurement-Based Classical Computation (MBCC), arise by dephasing the resource state in the measurement basis, yielding purely classical probability distributions. Certain MBCC families exhibit distributions that cannot be efficiently simulated classically (unless PH collapses), despite lacking quantum discord or Bell-inequality violation, highlighting the essential role of quantum sampling complexity even in nonadaptive models (Hoban et al., 2013).
