
Gaussian Boson Samplers Overview

Updated 3 January 2026
  • Gaussian Boson Samplers are photonic quantum devices that use squeezed vacuum inputs and linear optical networks to sample from complex multimode Gaussian states.
  • They employ hafnian-based probability calculations with advanced detection methods, ensuring computational tasks remain classically intractable under standard complexity assumptions.
  • Applications span graph optimization, molecular spectroscopy, and stochastic integration, with ongoing improvements in certification, detector modeling, and hybrid algorithmic approaches.

Gaussian Boson Samplers (GBS) are photonic quantum devices designed to sample from the photon-number distribution of large-scale multimode Gaussian states, specifically those generated by sending squeezed vacuum inputs through a passive linear optical network and measuring with non-adaptive photon counting. GBS defines a computational task that is, under standard complexity-theoretic conjectures, classically intractable, generalizing the original Boson Sampling paradigm to encompass arbitrary Gaussian input states. The use of squeezed light significantly enhances sampling rates, success probabilities, and scalability, positioning GBS as a leading candidate for demonstrating quantum advantage and for exploring quantum-enhanced graph algorithms, molecular spectroscopy, and stochastic numerical integration.

1. Theoretical Foundations and Mathematical Formalism

A GBS device prepares $m$ optical modes, each in a single-mode squeezed vacuum state obtained by applying the squeezing operator $S(r_i) = \exp\left[\frac{1}{2} r_i(a^2 - a^{\dagger 2})\right]$ to the vacuum $|0\rangle$, with squeezing parameter $r_i$. The initial pure Gaussian state has per-mode covariance $V_{\text{in}} = \frac{1}{2} \operatorname{diag}(e^{-2r_i}, e^{+2r_i})$ in the quadrature basis. These modes are interfered in an $m$-mode passive linear optical network described by a unitary $U$. The network acts on mode operators as $a_i \to \sum_j U_{ij} a_j$, and on covariance matrices via the real symplectic map $S_U$ as $V_{\text{out}} = S_U V_{\text{in}} S_U^\top$.
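
The covariance transformation can be sketched numerically. This is a minimal sketch, assuming the $xxpp$ quadrature ordering (a convention choice), in which a passive unitary $U$ acts through the real symplectic $S_U = \left(\begin{smallmatrix} \operatorname{Re} U & -\operatorname{Im} U \\ \operatorname{Im} U & \operatorname{Re} U \end{smallmatrix}\right)$; the helper names are illustrative:

```python
import numpy as np

def squeezed_vacuum_cov(r):
    """Covariance of m single-mode squeezed vacua, xxpp ordering."""
    r = np.asarray(r, dtype=float)
    return 0.5 * np.diag(np.concatenate([np.exp(-2 * r), np.exp(2 * r)]))

def passive_symplectic(U):
    """Real symplectic matrix implementing a passive linear-optical unitary U."""
    return np.block([[U.real, -U.imag], [U.imag, U.real]])

def output_cov(r, U):
    """V_out = S_U V_in S_U^T for squeezed inputs through the interferometer U."""
    S = passive_symplectic(U)
    return S @ squeezed_vacuum_cov(r) @ S.T
```

Because $S_U$ is symplectic (determinant one), the output state remains pure: $\det V_{\text{out}} = \det V_{\text{in}}$ regardless of $U$.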

Photon-number-resolving detectors at the outputs yield a click pattern $S = (s_1,\ldots,s_m)$ with total detected photon number $n = \sum_i s_i$. The probability $P_S$ of an outcome $S$ is governed by the hafnian: $P_S = \frac{1}{\prod_{i=1}^m s_i!} |\operatorname{Haf}(\Lambda_S)|^2$, where $\Lambda_S$ is constructed by repeating the $i$-th row and column of $A = U D U^\top$ (with $D = \operatorname{diag}(\tanh r_i)$) exactly $s_i$ times, and the hafnian sums over all perfect matchings of the $|S|$ vertices of $\Lambda_S$ (Lund et al., 2013, Kruse et al., 2018, Hamilton et al., 2016). For two-mode squeezed vacuum inputs and postselection, the GBS model recovers "scattershot" and standard Boson Sampling in appropriate limits.
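
The construction of the kernel $A$ and the reduced matrix $\Lambda_S$ can be sketched directly; this is a minimal illustration (the helper names `gbs_A_matrix` and `reduced_matrix` are hypothetical, and the overall normalization prefactor of $P_S$ is not addressed):

```python
import numpy as np

def gbs_A_matrix(U, r):
    """A = U D U^T with D = diag(tanh r_i), the kernel appearing in P_S."""
    D = np.diag(np.tanh(np.asarray(r, dtype=float)))
    return U @ D @ U.T

def reduced_matrix(A, pattern):
    """Lambda_S: repeat the i-th row and column of A exactly s_i times."""
    idx = np.repeat(np.arange(A.shape[0]), pattern)
    return A[np.ix_(idx, idx)]
```

For example, the pattern $S = (2, 0, 1)$ on a 3-mode kernel selects the $3 \times 3$ submatrix on the repeated index list $(0, 0, 2)$.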

2. Computational Complexity and Intractability

Exact and approximate sampling from GBS output distributions is believed to be classically intractable, subject to conjectures paralleling those for Gaussian-matrix permanents in Aaronson–Arkhipov Boson Sampling. The output amplitude, a hafnian of a submatrix with independent complex Gaussian entries, embeds a #P-hard problem: computing the hafnian generalizes computing the permanent, since $\operatorname{Perm}(G) = \operatorname{Haf}\left( \begin{smallmatrix} 0 & G \\ G^\top & 0 \end{smallmatrix} \right)$ (Kruse et al., 2018).
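
The permanent-embedding identity can be checked numerically with brute-force (exponential-time) evaluators, shown here purely for illustration:

```python
import numpy as np
from itertools import permutations

def permanent(G):
    """Permanent via the sum over permutations (small matrices only)."""
    n = G.shape[0]
    return sum(np.prod([G[i, p[i]] for i in range(n)])
               for p in permutations(range(n)))

def hafnian(A):
    """Hafnian via recursion over perfect matchings (small matrices only)."""
    n = A.shape[0]
    if n == 0:
        return 1.0
    if n % 2:
        return 0.0
    rest = list(range(1, n))
    return sum(A[0, j] * hafnian(A[np.ix_([v for v in rest if v != j],
                                          [v for v in rest if v != j])])
               for j in rest)

def haf_embedding(G):
    """Haf of the block matrix [[0, G], [G^T, 0]], which equals Perm(G)."""
    n = G.shape[0]
    Z = np.zeros((n, n))
    return hafnian(np.block([[Z, G], [G.T, Z]]))
```

Intuitively, every nonzero perfect matching of the block matrix must pair a "row" vertex with a "column" vertex, so the matching sum collapses to the permutation sum defining the permanent.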

The classical hardness persists in regimes where the number of modes $m$ scales quadratically with the photon number $n$, and it extends to approximate sampling under anti-concentration and average-case hardness conjectures (Grier et al., 2021, Lund et al., 2013). Hardness is preserved even for constant-collision outputs (multiple photons per mode) (Grier et al., 2021). Stockmeyer's approximate counting argument applies, ensuring that efficient classical sampling would collapse the polynomial hierarchy.

3. Algorithmic Features, Detector Models, and Approximations

Photon-Number Resolving and Threshold Detection

Standard GBS requires photon-number-resolving (PNR) detectors, with outcomes given by the hafnian formula above. In practice, threshold (on–off) detectors are more common and yield "click" patterns without photon-number resolution. Output probabilities then involve the Torontonian function, an inclusion–exclusion sum over principal minors of a matrix $O = I - \Sigma^{-1}$ built from the $Q$-function covariance $\Sigma$: $p(S) = \operatorname{Tor}(O_{(S)}) / \sqrt{\det \Sigma}$, where $O_{(S)}$ is the submatrix for the clicked modes (Quesada et al., 2018). In the collision-free regime, threshold and PNR detectors give nearly indistinguishable distributions, and the same complexity-theoretic hardness applies.

Imperfect and Realistic Detectors

Practical GBS implementations must contend with nonideal detector response, modeled as a convolution of the ideal count distribution with detector conditional probabilities $P_{n|m}$. The resulting output probability admits a functional form that interpolates between the Torontonian (on–off detection), the Kensingtonian (click counting), and the hafnian (PNR) (Yeremenko et al., 2024, Bressanini et al., 2023). Detector imperfections such as finite resolution and dead time can be captured explicitly in this formalism and are essential for device validation and certification.
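
As one concrete instance of this convolution picture, consider a lossy PNR detector with efficiency $\eta$, for which $P_{n|m} = \binom{m}{n}\eta^n(1-\eta)^{m-n}$. This binomial loss model is a standard simplification used here for illustration, not the full formalism of the cited works:

```python
import numpy as np
from math import comb

def binomial_loss_conditional(n_max, eta):
    """Matrix of P[n detected | m incident] for a lossy PNR detector."""
    P = np.zeros((n_max + 1, n_max + 1))
    for m in range(n_max + 1):
        for n in range(m + 1):
            P[n, m] = comb(m, n) * eta**n * (1 - eta) ** (m - n)
    return P

def observed_distribution(p_ideal, eta):
    """Convolve the ideal photon-number distribution with the detector response."""
    P = binomial_loss_conditional(len(p_ideal) - 1, eta)
    return P @ np.asarray(p_ideal, dtype=float)
```

For a single-photon input distribution and $\eta = 0.8$, the observed counts split into 20% vacuum and 80% single-photon events, as expected for independent photon loss.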

Randomized Classical Hafnian Estimators

For nonnegative matrices, randomized estimators for the hafnian—such as the Barvinok and Godsil–Gutman methods—sample random skew-symmetric matrices and estimate the hafnian via determinants. These estimators are unbiased but may have high variance; for random graph kernels, the variance increases only polynomially with size, enabling efficient classical approximation of low-order correlations, although pathological instances remain exponentially hard (Uvarov et al., 2023). This challenges the quantum advantage of GBS in regimes where only nonnegative kernels are relevant.
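
A minimal sketch of such an estimator, assuming a symmetric matrix with nonnegative entries and zero diagonal: draw independent uniform signs $\epsilon_{ij}$, form the skew-symmetric matrix $W_{ij} = \epsilon_{ij}\sqrt{A_{ij}}$ for $i < j$, and average $\det W = \operatorname{Pf}(W)^2$, which is an unbiased estimator of $\operatorname{Haf}(A)$ because cross terms between distinct perfect matchings vanish in expectation:

```python
import numpy as np

def godsil_gutman_hafnian(A, n_samples, rng):
    """Unbiased randomized estimator: E[det W] = Haf(A) for nonnegative A,
    where W is a randomly signed skew-symmetric square root of A."""
    n = A.shape[0]
    sqrtA = np.sqrt(A)
    total = 0.0
    for _ in range(n_samples):
        eps = rng.choice([-1.0, 1.0], size=(n, n))
        W = np.triu(eps * sqrtA, k=1)
        W = W - W.T  # skew-symmetric, so det W = Pf(W)^2 >= 0
        total += np.linalg.det(W)
    return total / n_samples
```

For example, for the matrix with upper-triangular entries $(1,2,3,4,5,6)$ the exact hafnian is $1\cdot 6 + 2\cdot 5 + 3\cdot 4 = 28$, and the running mean of the determinant samples converges to this value, with a variance that depends on the matrix ensemble as discussed above.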

4. Experimental Implementations and Validation

Typical GBS platforms utilize single-mode squeezed vacuum sources (e.g., optical parametric oscillators), integrated or free-space linear interferometers (networks of beam splitters and phase shifters), and high-efficiency detectors (Zhong et al., 2019). Current squeezing levels of $r \sim 0.5$–$1$ ($4$–$9$ dB) and per-photon transmission $\eta \gtrsim 0.8$ enable sampling rates of order MHz for samples of $\lesssim 20$ photons. Integrated photonics with on-chip sources and detectors is under active development and crucial for scalability (Lund et al., 2013).

Certification of GBS devices is intrinsically challenging due to the #P-hardness of simulating output distributions. Graph-theoretic certification methods evaluate feature vectors or graph kernels (linear and non-linear) derived from output patterns to discriminate genuine indistinguishable-Gaussian samples from plausible classical "spoofing" distributions. Empirical cloud separation and statistical learning techniques achieve high-confidence certification without full probability estimation (Giordani et al., 2022).

Validation using coarse-grained collision events and orbits in photocount space, with statistical tests such as Pearson $\chi^2$ and Bayesian likelihood ratios, enables exclusion of positive-$P$ classical models, especially when minimal photon-number resolution is retained (Yeremenko et al., 2024).

5. Applications in Algorithmics and Optimization

GBS naturally encodes graph-theoretic quantities via hafnians, allowing the approximation of combinatorial problems such as maximum weight clique, densest subgraph, and molecular docking. For a graph $G$ with adjacency matrix $A$, programmable GBS devices sample subgraphs $S$ with probabilities $\propto |\operatorname{Haf}(A_S)|^2$, biasing strongly toward dense substructures. Weight-biasing via diagonal rescaling $\Omega$ further preferentially samples heavy cliques (Banchi et al., 2019). These quantum samples serve as seeds for hybrid classical–quantum heuristics, often outperforming purely classical random search in maximum clique and densest subgraph finding (Zhong et al., 2019, Raghuraman et al., 23 Jul 2025).
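
The density bias can be illustrated exhaustively on a toy graph. This brute-force sketch (not how a hardware device operates) scores every $4$-vertex subset by $|\operatorname{Haf}(A_S)|^2$ and finds that a planted $4$-clique maximizes the GBS sampling probability:

```python
import numpy as np
from itertools import combinations

def hafnian(A):
    """Hafnian via recursion over perfect matchings (small matrices only)."""
    n = A.shape[0]
    if n == 0:
        return 1.0
    if n % 2:
        return 0.0
    rest = list(range(1, n))
    return sum(A[0, j] * hafnian(A[np.ix_([v for v in rest if v != j],
                                          [v for v in rest if v != j])])
               for j in rest)

def best_subgraph(A, k):
    """Exhaustively find the k-vertex subset maximizing |Haf(A_S)|^2."""
    return max(combinations(range(A.shape[0]), k),
               key=lambda S: abs(hafnian(A[np.ix_(S, S)])) ** 2)
```

For a $4$-clique the hafnian counts its three perfect matchings ($\operatorname{Haf} = 3$), whereas sparser $4$-vertex subgraphs have at most one matching, so the clique dominates the sampling distribution.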

Moreover, GBS-based estimators can yield exponential speedups for integration problems involving high-order Gaussian moments: for specifically constructed function classes, the sample complexity of estimating certain Gaussian expectations from GBS output is exponentially smaller than for plain Monte Carlo (Andersen et al., 26 Feb 2025).

In practical graph instances with nonnegative kernels, quantum-inspired classical algorithms—leveraging completely positive factorizations and efficient rejection sampling—can nearly match GBS sample complexity up to polynomial factors, and even match GBS performance for typical applications, limiting practical quantum advantage to carefully chosen, structured ensembles (Oh et al., 2023, Raghuraman et al., 23 Jul 2025).

6. Future Directions and Open Challenges

Major open challenges include:

  • Hardness for approximate sampling: Rigorous proof for the classical intractability of approximate sampling from GBS distributions with realistic noise, under physically reasonable models of squeezing, loss, and partial photon indistinguishability, is incomplete (Grier et al., 2021).
  • Loss-tolerant and hybrid protocols: Strategies for error mitigation, including active heralded correction and hybrid photonic schemes with ancillae or non-Gaussian operations, may expand the computational reach of GBS devices (Lund et al., 2013).
  • Efficient certification at scale: New validation frameworks beyond classical sampling—possibly exploiting connections to graph kernels, molecular spectra, or machine learning tasks—are needed for credible demonstration of quantum advantage in high-dimensional GBS.
  • Algorithmic optimizations: Compiling graph problems to GBS input parameters via low-rank or structured Takagi decompositions, and leveraging GBS devices as subroutines for classical or hybrid solvers, represents a promising strategy. Efficient low-rank factorizations could alleviate the $O(M^3)$ preprocessing bottleneck for structured graphs (Raghuraman et al., 23 Jul 2025).
  • Hardware advances: Integration of high-efficiency, scalable sources and detectors, loss-minimal interferometer architectures, and real-time reconfigurability are critical for scaling GBS devices to the quantum advantage frontier.
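
For the special case of a real symmetric kernel, the Takagi decomposition $A = U \operatorname{diag}(d)\, U^\top$ with $d \ge 0$ reduces to an eigendecomposition with phase fixes: columns belonging to negative eigenvalues are multiplied by $i$. This is a minimal sketch of that special case, not a general complex-symmetric Takagi routine:

```python
import numpy as np

def takagi_real_symmetric(A):
    """Takagi factorization A = U diag(d) U^T with d >= 0, for real symmetric A.
    Negative eigenvalues are absorbed by scaling their eigenvectors by i."""
    w, Q = np.linalg.eigh(A)
    d = np.abs(w)
    phases = np.where(w >= 0, 1.0 + 0.0j, 1.0j)  # square root of the sign
    U = Q.astype(complex) * phases  # scale each column by its phase
    return U, d
```

The returned $U$ is unitary and $d$ collects the singular values of $A$, which in the GBS compilation map to the squeezing parameters via $d_i = \tanh r_i$ after suitable rescaling of the kernel.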

The continued development of theoretical tools for evaluating the practical and foundational power of Gaussian Boson Samplers, along with experimental advances in photonic integration and detection, remains central to the future of quantum computational supremacy demonstrations and quantum-enhanced combinatorial optimization.
