Non-Adaptive Measurement-Based Quantum Computation
- Non-adaptive measurement-based quantum computation (NBQC) is a model that executes a computation in a single parallel round by applying a fixed, pre-determined set of measurement settings to a highly entangled resource state.
- The approach combines structured resource states with binary-matrix pre-processing that maps classical inputs to measurement bases, enabling a sharp analysis of resource hierarchies and complexity trade-offs.
- By constraining measurements to specific levels of the Clifford hierarchy, NBQC delineates computational boundaries in terms of deterministically evaluable Boolean functions and minimal qubit counts.
Measurement-Based Quantum Computation (MBQC) is a paradigm in which quantum computation is driven by measurements on a highly entangled multi-qubit resource state, typically a graph or cluster state. The measurement primitives are local and, depending on the MBQC variant, may be either adaptive (measurement bases determined by previous outcomes) or non-adaptive (measurement settings fixed in advance). Non-adaptive MBQC (abbreviated NBQC) is defined by the absence of adaptivity: all measurement settings are fixed at the outset, and the measurement process can be conducted in a single parallel round. NBQC is particularly valuable for analyzing the minimal and structural quantum resources required for specific computational tasks, enabling precise delineation of quantum-classical computational boundaries and resource trade-offs (Frembs et al., 2022, Hoban et al., 2013).
1. Formal Structure of Non-Adaptive MBQC
A non-adaptive MBQC computation employs a fixed $n$-qubit entangled resource state (frequently a stabilizer or graph state), with $m$ bits of classical input used to select the measurement settings for each local measurement. The classical side-processing can be decomposed into:
- Pre-processing: An $n \times m$ binary matrix $T$ maps each input $i \in \mathbb{Z}_2^m$ to measurement settings $q = Ti$.
- Measurement: Each qubit $k$ is measured in one of two allowable basis choices (Pauli-type observables), yielding outcome $m_k \in \mathbb{Z}_2$.
- Post-processing: The parity (possibly with an offset) of the measurement outcomes yields the final output $o = \bigoplus_k m_k$.
In the deterministic regime, the distributed measurement defines a global operator $\mathcal{O}(i)$, the product of the selected local observables, of which the resource state $|\psi\rangle$ is an eigenstate with eigenvalue $(-1)^{f(i)}$: $\mathcal{O}(i)\,|\psi\rangle = (-1)^{f(i)}\,|\psi\rangle$. Notably, because all settings are fixed by the input and do not depend on previous measurement results, the computation is temporally flat and can be executed in a single round [(Frembs et al., 2022), Def. 1].
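This structure can be made concrete with the well-known GHZ example of Anders and Browne, which computes OR(a, b) (a quadratic function) non-adaptively: the pre-processing map sends $(a, b)$ to the settings $(a, b, a \oplus b)$, setting 0 selects an X measurement and setting 1 a Y measurement, and the parity of the three outcomes is deterministic. A minimal NumPy sketch (function names are illustrative):

```python
import numpy as np

# Pauli observables; eigenvalues +1/-1 map to output bits 0/1.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

# 3-qubit GHZ resource state (|000> + |111>)/sqrt(2).
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

def kron(*ops):
    out = np.array([[1]], dtype=complex)
    for op in ops:
        out = np.kron(out, op)
    return out

def nbqc_or(a, b):
    """Non-adaptive MBQC for OR(a, b): settings fixed by the input,
    output is the parity of the three measurement outcomes."""
    # Pre-processing: linear map (a, b) -> settings (a, b, a XOR b);
    # setting 0 selects X, setting 1 selects Y.
    settings = [a, b, a ^ b]
    obs = kron(*[(Y if s else X) for s in settings])
    # GHZ is an eigenstate of each such observable product, so the
    # parity of the outcomes is deterministic: the eigenvalue
    # (-1)^f(a,b) equals <ghz|O|ghz>.
    eigenvalue = np.real(ghz.conj() @ obs @ ghz)
    return int(round((1 - eigenvalue) / 2))  # map +1 -> 0, -1 -> 1

for a in (0, 1):
    for b in (0, 1):
        assert nbqc_or(a, b) == (a | b)
```

The determinism here comes from the GHZ stabilizer relations (XXX has eigenvalue +1, while XYY, YXY, YYX have eigenvalue -1), so no measurement needs to wait on any other.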
2. Hierarchies via Clifford Hierarchy Constraints
A central insight of NBQC is that restricting measurement bases to unitary rotations within a given level of the single-qubit Clifford hierarchy yields a strict correspondence between computational power and resource complexity. Each basis choice is constrained to lie in the level-$d$ set $C_d$, with $C_1 = \mathcal{P}$ the Pauli group and higher levels defined recursively by $C_{d+1} = \{U : U \mathcal{P} U^\dagger \subseteq C_d\}$.
- Level $d = 2$ (Clifford): Only stabilizer measurements allowed. This is classically simulable by the Gottesman-Knill theorem.
- Level $d = 3$ (includes certain non-Clifford gates, such as the $T$ gate): Adaptive MBQC with $C_3$ measurements suffices for universality, but the non-adaptive variant remains limited.
This axis—the minimum hierarchy depth enabling a given computation—is often termed the "magic" or "non-Cliffordness" resource axis.
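The recursive definition can be checked numerically for small gates. The sketch below (helper names are illustrative) verifies that the phase gate $S$ sits in level 2 while the $T$ gate sits in level 3 but not level 2:

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = [I2, X, Y, Z]

def proportional_to_pauli(U):
    """True if the 2x2 unitary U equals a Pauli up to global phase."""
    return any(
        abs(abs(np.trace(P.conj().T @ U)) - 2) < 1e-9 for P in PAULIS
    )

def in_level(U, d):
    """Recursive single-qubit Clifford-hierarchy test:
    C_1 = Paulis; U is in C_{d+1} iff U P U^dag is in C_d for the
    Pauli generators P = X, Z (Y = iXZ follows automatically)."""
    if d == 1:
        return proportional_to_pauli(U)
    return all(in_level(U @ P @ U.conj().T, d - 1) for P in (X, Z))

S = np.diag([1, 1j])                       # phase gate: Clifford
T = np.diag([1, np.exp(1j * np.pi / 4)])   # T gate: level 3, not 2

assert in_level(S, 2) and not in_level(S, 1)
assert in_level(T, 3) and not in_level(T, 2)
```

Conjugating only the generators X and Z suffices because any Pauli is a phase times a product of the two.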
3. Boolean Function Computability and Resource Hierarchies
The achievable Boolean functions are sharply characterized:
- Stabilizer MBQC / Level-2: Only functions $f$ that are quadratic polynomials over $\mathbb{Z}_2$ in the input bits can be computed deterministically. If $f$ is not quadratic, the maximum achievable success probability with stabilizer MBQC is $1 - \mathrm{NQ}(f)/2^m$, where $\mathrm{NQ}(f)$ is the minimal Hamming distance from $f$ to any quadratic function [(Frembs et al., 2022), Cor. 1].
- Level-$d$ NBQC: Deterministic computation is possible only for $f$ with $\mathbb{Z}_2$-degree at most $d$; polynomials of degree exceeding $d$ cannot be computed deterministically at that level [(Frembs et al., 2022), Thm. 4].
- Resource State and Qubit Count: For quadratic functions, the minimal qubit count is determined by the rank of the associated symmetric quadratic form over $\mathbb{Z}_2$ [(Frembs et al., 2022), Thm. 5]. More generally, for arbitrary $f$, the Walsh–Fourier expansion determines the minimal GHZ-state size: the minimal qubit count equals the smallest number of nonzero Walsh coefficients achievable by adding polynomials that represent the zero function [(Frembs et al., 2022), Thm. 8]. As a result, there exist functions of high degree whose minimal qubit count is nonetheless small, so minimal qubit count is not strictly monotonic in degree.
This structure establishes a bi-axial "resource hierarchy"—vertical axis is the Clifford hierarchy level; horizontal axis is minimal required qubit count.
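The $\mathbb{Z}_2$-degree that organizes the vertical axis can be computed directly from a truth table with the binary Möbius (ANF) transform; a short illustrative sketch:

```python
def anf_coeffs(truth_table, n):
    """Algebraic normal form over Z_2 via the binary Moebius transform.
    truth_table[x] is f at input x, with bit i of x the i-th input."""
    c = list(truth_table)
    for i in range(n):
        for x in range(1 << n):
            if x & (1 << i):
                c[x] ^= c[x ^ (1 << i)]
    return c

def degree(truth_table, n):
    """Z_2-degree: largest weight of a monomial with coefficient 1."""
    c = anf_coeffs(truth_table, n)
    return max((bin(x).count("1") for x in range(1 << n) if c[x]),
               default=0)

# OR(a, b) = a XOR b XOR ab is quadratic -> stabilizer-NBQC computable.
or2 = [a | b for b in (0, 1) for a in (0, 1)]
assert degree(or2, 2) == 2

# AND of three bits has degree 3 -> needs hierarchy level d >= 3.
and3 = [a & b & c for c in (0, 1) for b in (0, 1) for a in (0, 1)]
assert degree(and3, 3) == 3
```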
4. Resource Complexity, Flow, and Physical Constraints
The patterns of measurements and topology of the resource state are subject to further optimization:
- Flow vs. gFlow: A graph has flow if measurement dependencies can be arranged such that only a small, bounded number of physical qubits beyond the inputs/outputs must be held in memory at any point. Patterns without flow but with the more general gflow may require up to $N$ qubits to be simultaneously present, $N$ being the total pattern size (Houshmand et al., 2017). Clifford/Pauli measurements (X/Y) can be performed first and absorbed into local graph transformations, further affecting resource requirements.
- Trade-Offs: For highly constrained physical architectures, MBQC variants supporting flow are much more hardware-efficient, since the computation can be realized on a minimal set of qubits at any one time ('on-the-fly' extension) (Houshmand et al., 2017).
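The flow condition is mechanical to verify. A sketch of the Danos–Kashefi conditions, where the `has_flow` helper and graph encoding are illustrative and a 1D cluster state serves as the open graph:

```python
def has_flow(neighbors, inputs, outputs, f, leq):
    """Check the causal-flow conditions for an open graph: for every
    non-output vertex i,
      (F1) f(i) is a neighbour of i (and not an input),
      (F2) i strictly precedes f(i),
      (F3) i precedes every other neighbour of f(i)."""
    for i in neighbors:
        if i in outputs:
            continue
        if f[i] not in neighbors[i] or f[i] in inputs:
            return False                      # F1
        if not leq(i, f[i]) or i == f[i]:
            return False                      # F2
        if any(not leq(i, k) for k in neighbors[f[i]] if k != i):
            return False                      # F3
    return True

# 1D cluster state 0-1-2-3 (input 0, output 3): the successor map is
# a flow, so the pattern runs with few qubits alive at a time.
path = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
succ = {0: 1, 1: 2, 2: 3}
assert has_flow(path, {0}, {3}, succ, lambda a, b: a <= b)
```

Here the partial order is just the natural order on vertex labels; for general graphs any order compatible with the three conditions works.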
5. Non-Adaptive MBQC, IQP*, and Complexity-Theoretic Separations
NBQC includes families such as IQP* ("Instantaneous Quantum Polynomial-time") circuits, which consist of commuting gates diagonal in the X basis (equivalently, Z-diagonal circuits conjugated by Hadamards) and are efficiently implemented via non-adaptive MBQC (Hoban et al., 2013). IQP* output distributions are believed to be hard to sample exactly on a classical computer (unless the polynomial hierarchy collapses to its third level), even though the underlying resource states are separable in the measurement basis and cannot violate Bell inequalities. Thus, non-classicality and computational advantage can manifest without entanglement or contextuality in the standard sense [(Hoban et al., 2013), Theorem 1].
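Concretely, an IQP* computation reduces to a diagonal phase layer sandwiched between Hadamard layers; a brute-force statevector sketch of the resulting output distribution (function name illustrative):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def iqp_distribution(n, phases):
    """Output distribution of an n-qubit IQP-style circuit:
    Hadamards, a diagonal phase layer D = diag(exp(i*phase_x)),
    then Hadamards again, starting from |0...0>.
    phases: dict mapping computational-basis index x -> phase."""
    Hn = H
    for _ in range(n - 1):
        Hn = np.kron(Hn, H)
    D = np.diag([np.exp(1j * phases.get(x, 0.0))
                 for x in range(2 ** n)])
    psi = Hn @ D @ Hn @ np.eye(2 ** n)[:, 0]
    return np.abs(psi) ** 2

# Example: a CZ between the two qubits (phase pi on |11>).
p = iqp_distribution(2, {3: np.pi})
assert np.isclose(p.sum(), 1.0)
assert np.allclose(p, 0.25)   # this instance is uniform
```

For this tiny instance the distribution is classically trivial; the hardness claims concern scaling families of such circuits, not any fixed small example.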
A tight summary of complexity-class separations in this setting:
| MBQC Subtheory | Determinism | Hierarchy Level | Complexity | Computable |
|---|---|---|---|---|
| Non-adaptive, Clifford | Yes | $d = 2$ | P | Quadratic polynomials |
| Non-adaptive, level $d$ | Yes | $d$ | BQP-hard | Degree-$\leq d$ polynomials |
| IQP* family (non-adapt.) | Sampling | Diagonal gates | PP-hard sampling | Parity-of-product functions |
| Non-adaptive, unbounded $d$ | Yes | Unbounded | Exponential resources | Arbitrary Boolean functions |
| Adaptive, $d = 3$ | Yes | $d = 3$ | BQP-complete | Universal |
6. Byproducts, Causality, and the Necessity of Adaptivity
A fundamental distinction between NBQC and general (adaptive) MBQC is the management of Pauli byproducts arising from the inherent randomness of quantum measurement. In general MBQC, these byproducts necessitate classical feed-forward: measurement outcomes determine future measurement bases to ensure deterministic computation. Morimae's no-go theorem proves that it is impossible to construct an MBQC resource state that is universally byproduct-free without violating the no-signaling principle; thus, non-adaptive MBQC cannot be universal for quantum computation (in the circuit sense) (Morimae, 2012).
However, for certain circuit families (notably IQP*), all logical byproducts commute with the unitaries present, so adaptivity is not required and the overall operation can be implemented in a single measurement round (Hoban et al., 2013).
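This commutation argument is easy to check numerically: a Pauli-X byproduct passes through an X-diagonal two-qubit rotation unchanged, but not through a Z-diagonal entangling rotation, which is why general MBQC needs adaptive corrections. A small sketch:

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def rot(P, theta):
    """exp(i*theta*P) for a two-qubit Pauli product P (P @ P = I)."""
    return np.cos(theta) * np.eye(4) + 1j * np.sin(theta) * P

XX = np.kron(X, X)
ZZ = np.kron(Z, Z)
byproduct = np.kron(X, I2)   # Pauli-X byproduct on qubit 1
theta = 0.7                  # arbitrary gate angle

# X byproducts commute through X-diagonal (IQP*-type) gates ...
assert np.allclose(byproduct @ rot(XX, theta),
                   rot(XX, theta) @ byproduct)
# ... but anticommutation with ZZ spoils it for generic gates.
assert not np.allclose(byproduct @ rot(ZZ, theta),
                       rot(ZZ, theta) @ byproduct)
```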
7. Connections to Classical and Nonclassical Computing
Classical analogues of MBQC, referred to as Measurement-Based Classical Computation (MBCC), arise by dephasing the resource state in the measurement basis, yielding purely classical probability distributions. Certain MBCC families exhibit distributions that cannot be efficiently simulated classically (unless PH collapses), despite lacking quantum discord or Bell-inequality violation, highlighting the essential role of quantum sampling complexity even in nonadaptive models (Hoban et al., 2013).
References
- Hierarchies of resources for measurement-based quantum computation (Frembs et al., 2022)
- Measurement-based classical computation (Hoban et al., 2013)
- Measurement-based quantum computation cannot avoid byproducts (Morimae, 2012)
- Minimal physical resources for the realisation of measurement-based quantum computation (Houshmand et al., 2017)