Gaussian Boson Sampling: Theory & Applications
- Gaussian Boson Sampling is a photonic quantum computation scheme that leverages squeezed vacuum states and a passive linear interferometer to sample photon-number distributions with probabilities determined by the matrix hafnian.
- It enhances experimental efficiency by directly using squeezed states, which increases multiphoton event probabilities and reduces resource overhead compared to traditional Fock-based schemes.
- The computational complexity of GBS is rooted in the #P-hard nature of hafnian evaluations, suggesting classical simulation is intractable and making it a strong candidate for demonstrating quantum advantage.
Gaussian Boson Sampling (GBS) is a photonic quantum computational scheme that samples from the photon-number distribution of multimode Gaussian states—specifically, those comprising squeezed vacuum states injected into a passive linear interferometer. In contrast to traditional Boson Sampling, which relies on single-photon Fock states and whose output probabilities are governed by the matrix permanent, GBS employs the full nonclassical structure of squeezed states, leading output probabilities to depend on the matrix hafnian. The computational task of sampling from the resulting distribution is conjectured to be classically intractable due to #P-hardness properties, and GBS is distinguished by its experimental efficiency and potential for near-term photonic quantum advantage (Hamilton et al., 2016).
1. Theoretical Framework and Definition
GBS is mathematically defined by the probability of observing a photon-number pattern $\bar{n} = (n_1, \dots, n_M)$ at the interferometer outputs:

$$\Pr(\bar{n}) = \frac{1}{\sqrt{\det(\sigma_Q)}} \, \frac{\text{Haf}(\mathcal{A}_S)}{n_1! \cdots n_M!},$$

where $\sigma$ is the covariance matrix of the overall Gaussian state (with $\sigma_Q = \sigma + \mathbb{1}/2$), $\mathcal{A}_S$ is a submatrix of $\mathcal{A} = X\left(\mathbb{1} - \sigma_Q^{-1}\right)$, $X = \left(\begin{smallmatrix} 0 & \mathbb{1} \\ \mathbb{1} & 0 \end{smallmatrix}\right)$, constructed from the interferometer transformation and the squeezing parameters, and the hafnian $\text{Haf}$ encodes weighted sums over perfect matchings. In the case of single-mode squeezed inputs and collision-free detection events (at most one photon per mode), $\mathcal{A}_S$ is a symmetric matrix whose entries are determined by the mode transformations and input squeezing.
The sampling problem is: given a specification of the Gaussian input (squeezing, displacement), linear optics network (passive unitary or sub-unitary), and ideal photon-number-resolving detectors, generate samples from this probability distribution.
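As an illustration, the distribution can be evaluated directly for a toy instance. The sketch below uses the equivalent pure-state form of the GBS probability from the literature, $\Pr(\bar{n}) = |\text{Haf}(B_S)|^2 / (\bar{n}!\,\prod_j \cosh r_j)$ with $B = U\,\text{diag}(\tanh r_j)\,U^T$, to compute the collision-free probability $\Pr(1,1)$ for one squeezed mode and one vacuum mode meeting at a 50:50 beamsplitter; the brute-force hafnian helper is illustrative:

```python
import math

def haf(A):
    # brute-force hafnian: pair index 0 with each j, recurse on the rest
    n = len(A)
    if n == 0:
        return 1.0
    total = 0.0
    for j in range(1, n):
        rest = [k for k in range(1, n) if k != j]
        sub = [[A[r][c] for c in rest] for r in rest]
        total += A[0][j] * haf(sub)
    return total

# One squeezed mode (parameter r) and one vacuum mode into a 50:50
# beamsplitter: B = U diag(tanh r_j) U^T for the pure-state formula
r = 0.5
s = math.sqrt(0.5)
U = [[s, s], [s, -s]]
tanh_r = [math.tanh(r), 0.0]
B = [[sum(U[i][k] * tanh_r[k] * U[j][k] for k in range(2)) for j in range(2)]
     for i in range(2)]

# collision-free pattern (1, 1): B_S is B itself, n! = 1
p11 = abs(haf(B)) ** 2 / math.cosh(r)
print(p11, math.tanh(r) ** 2 / (4 * math.cosh(r)))
```

The result agrees with the direct Fock-space calculation for this two-mode example, $\tanh^2 r / (4\cosh r)$.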
2. Squeezed States as Quantum Resource
Single-mode squeezed vacuum states are nonclassical resources defined by reduced quadrature noise below the vacuum limit. In the GBS protocol, squeezed input states (with mean photon number $\langle n \rangle = \sinh^2 r$ for squeezing parameter $r$) replace probabilistically heralded single-photon sources. This direct usage enables the retention and exploitation of higher-order Fock components, sharply increasing the overall multiphoton event generation probability. The nonclassical character of squeezing ensures that the output statistics are sensitive to quantum interference effects not present in classical light, and is fundamental to the computational complexity of the GBS output distribution (Hamilton et al., 2016).
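For concreteness, the photon-number statistics of a single squeezed mode can be tabulated directly; a minimal sketch (the even-Fock distribution $P(2n) = \tanh^{2n} r \,(2n)!/(4^n (n!)^2 \cosh r)$ is standard, the function name is illustrative):

```python
import math

def squeezed_vacuum_pn(r, nmax=40):
    """Photon-number distribution of single-mode squeezed vacuum.

    Only even Fock states are populated:
    P(2n) = tanh(r)^(2n) * (2n)! / (4^n * (n!)^2 * cosh(r)).
    """
    return {2 * n: (math.tanh(r) ** (2 * n) * math.factorial(2 * n)
                    / (4 ** n * math.factorial(n) ** 2 * math.cosh(r)))
            for n in range(nmax + 1)}

r = 1.0
p = squeezed_vacuum_pn(r)
mean = sum(k * pk for k, pk in p.items())
# mean photon number matches sinh^2(r)
print(round(mean, 6), round(math.sinh(r) ** 2, 6))
```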
3. Hafnian Function and Complexity
The theoretical advance underlying GBS is the identification of the hafnian as the matrix function determining output probabilities. For an $M$-mode Gaussian state measured in a collision-free pattern $\bar{n}$,

$$\Pr(\bar{n}) = \frac{\text{Haf}(\mathcal{A}_S)}{\sqrt{\det(\sigma_Q)}},$$

where the hafnian, for a symmetric $2m \times 2m$ matrix $A$, is given by

$$\text{Haf}(A) = \sum_{\mu \in \text{PMP}} \prod_{(i,j) \in \mu} A_{i,j},$$

with the sum running over all perfect matchings $\mu$ of the indices $\{1, \dots, 2m\}$. The hafnian generalizes the permanent, as for a block-structured matrix $\left(\begin{smallmatrix} 0 & G \\ G^T & 0 \end{smallmatrix}\right)$ one recovers $\text{Haf}\left(\begin{smallmatrix} 0 & G \\ G^T & 0 \end{smallmatrix}\right) = \text{Per}(G)$. Evaluating the hafnian is known to be #P-hard, and the paper shows that approximating GBS output probabilities to multiplicative error is at least as hard as the corresponding task for permanents.
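The matching-sum definition and the reduction to the permanent are small enough to check by brute force; a sketch (helper names are illustrative):

```python
import math
from itertools import permutations

def haf(A):
    # brute-force hafnian over perfect matchings (len(A) must be even)
    n = len(A)
    if n == 0:
        return 1
    total = 0
    for j in range(1, n):
        rest = [k for k in range(1, n) if k != j]
        sub = [[A[r][c] for c in rest] for r in rest]
        total += A[0][j] * haf(sub)
    return total

def per(G):
    # brute-force permanent
    m = len(G)
    return sum(math.prod(G[i][s[i]] for i in range(m))
               for s in permutations(range(m)))

G = [[1, 2], [3, 4]]
m = len(G)
# embed G in the block matrix [[0, G], [G^T, 0]]
A = ([[0] * m + list(G[i]) for i in range(m)]
     + [[G[j][i] for j in range(m)] + [0] * m for i in range(m)])
print(haf(A), per(G))  # both print 10
```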
In system sizes relevant for near-term experiments, classical sampling from GBS distributions is expected to be infeasible, subject to complexity conjectures aligned with those of standard Boson Sampling.
4. Experimental Protocol and Implementation
A GBS experiment comprises:
- Input: single-mode squeezed vacuum states, typically generated using parametric down-conversion or waveguided nonlinear crystals.
- Interferometer: An $M$-mode linear optical network described by a random (e.g., Haar-distributed) unitary matrix, with $M$ scaling faster than the photon number $N$ (e.g., $M \sim N^2$), designed to distribute input photons and suppress multi-photon occupancy per mode.
- Detection: Photon-number-resolving detectors (e.g., superconducting nanowire devices) measure the output in the Fock basis. The architecture is arranged so that most output patterns are collision-free.
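The Haar-distributed unitary describing the network can be sampled numerically, e.g. by QR decomposition of a complex Ginibre matrix with a phase correction (a standard prescription; the sketch assumes NumPy):

```python
import numpy as np

def haar_unitary(m, seed=None):
    """Sample an m x m Haar-random unitary via QR of a complex
    Ginibre matrix, normalizing the phases of R's diagonal so the
    resulting distribution is actually Haar."""
    rng = np.random.default_rng(seed)
    z = (rng.standard_normal((m, m))
         + 1j * rng.standard_normal((m, m))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    return q * (d / np.abs(d))  # multiply column j by phase of r_jj

U = haar_unitary(6, seed=42)
print(np.allclose(U @ U.conj().T, np.eye(6)))  # unitarity check: True
```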
Resource and scaling requirements are as follows:
| Parameter | GBS Requirement | Postselected Fock BS | Scattershot BS |
|---|---|---|---|
| Number of inputs | $N$ single-mode squeezed states | $N$ single photons | $M$ two-mode squeezed states |
| Number of modes | $M$ | $M$ | $M$ (plus $M$ heralding modes) |
| Sampling space | Reduced (unconditioned on input locations) | Binomial over outputs | Binomial over inputs |
Experimental challenges include achieving low-loss, high-purity squeezed states, maintaining high interferometer stability and full transformation rank, tuning squeezing parameters to optimal gain, handling dark counts, and scaling detection arrays.
5. Advantages over Fock Boson Sampling
GBS improves upon previous photonic sampling schemes through several mechanisms:
- Generation Probability: The use of squeezed states directly, rather than as heralded single photons, leads to exponentially higher probabilities for relevant multiphoton events—according to the negative binomial vs. binomial statistics of GBS and Fock inputs, respectively.
- Sampling Space Reduction: GBS does not require conditioning on specific photon input locations, avoiding the exponential scaling of the space of valid input-output configurations seen in Scattershot Boson Sampling.
- Resource Efficiency: Fewer squeezed states and modes are required to reach comparable photon-numbers and event rates, mainly because all multi-photon configurations contribute to valid samples, rather than the strictly rare "one-photon-per-mode" events isolated in conventional boson sampling.
- Measurement Time: The sample space reduction directly lessens the measurement time required to obtain meaningful statistics.
6. Computational Hardness and Complexity-Theoretic Status
The output probabilities in GBS are #P-hard to compute because they are functions of the hafnian. By analogy with Boson Sampling's complexity-theoretic argument (where efficient simulation would collapse the polynomial hierarchy), GBS inherits this hardness under the assumption that approximating the hafnian on typical Gaussian matrices is classically infeasible.
The paper articulates two key conjectures for approximate (as opposed to exact) GBS to inherit #P-hardness:
- Hafnian-of-Gaussians Conjecture: Approximating the hafnian of a Gaussian matrix is #P-hard even up to multiplicative error.
- Anti-concentration Conjecture: Hafnians of typical Gaussian matrices are not concentrated near zero, which is necessary for Stockmeyer-type reductions to go through.
These conjectures are critical to ruling out potential weaknesses in the hardness argument for near-term devices and remain an open area for rigorous proof.
7. Future Directions and Open Problems
Open research questions and prospects outlined include:
- Approximate sampling complexity: Establishing the hardness of GBS sampling to within multiplicative/additive error in the output probabilities, supported by rigorous proofs akin to the original Boson Sampling hardness arguments.
- Loss and noise tolerance: Determining the threshold loss and noise levels that preserve quantum computational advantage, beyond which the output statistics admit efficient classical descriptions (potentially entering the "classical simulable" regime).
- Protocol Extensions: GBS encompasses a family of protocols, unifying scattershot, photon-added/subtracted, and displacement-based schemes, and holds promise for further generalizations (e.g., reverse GBS—Fock input with Gaussian measurements).
- Applications: GBS's structure suggests direct applications in molecular vibronic spectroscopy and graph-theoretic problems (e.g., dense subgraph identification), owing to the mapping between detection statistics and combinatorial features (hafnian counts) of encoded matrices.
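The graph connection is concrete: if a graph's 0/1 adjacency matrix is taken as the hafnian argument, $\text{Haf}(A)$ counts the graph's perfect matchings, which is the combinatorial quantity the detection statistics probe. A brute-force check on two four-vertex graphs:

```python
def haf(A):
    # brute-force hafnian: sums over all perfect matchings of indices
    n = len(A)
    if n == 0:
        return 1
    total = 0
    for j in range(1, n):
        rest = [k for k in range(1, n) if k != j]
        sub = [[A[r][c] for c in rest] for r in rest]
        total += A[0][j] * haf(sub)
    return total

K4 = [[0, 1, 1, 1],    # complete graph on 4 vertices
      [1, 0, 1, 1],
      [1, 1, 0, 1],
      [1, 1, 1, 0]]
P4 = [[0, 1, 0, 0],    # path graph 1-2-3-4
      [1, 0, 1, 0],
      [0, 1, 0, 1],
      [0, 0, 1, 0]]
print(haf(K4), haf(P4))  # K4 has 3 perfect matchings, the path has 1
```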
A plausible implication is that the rigorous elucidation of the hafnian's average-case complexity and anti-concentration properties will critically determine the future of GBS as a near-term candidate for demonstrating quantum advantage.
In summary, Gaussian Boson Sampling is defined by sampling photon-number patterns at the output of a passive linear interferometer fed by squeezed vacuum states, where output probabilities are determined by the matrix hafnian. This method offers exponential improvements in event generation and resource efficiency and encodes computational problems that are #P-hard, standing as a leading approach for demonstrating quantum advantage with photonic devices (Hamilton et al., 2016). Future work is directed toward solidifying its complexity-theoretic foundation under realistic noise and loss, extending its protocol family, and exploiting new applications in high-dimensional quantum simulation and combinatorial optimization.