
BosonSampling with Lost Photons (1510.05245v2)

Published 18 Oct 2015 in quant-ph and cs.CC

Abstract: BosonSampling is an intermediate model of quantum computation in which linear-optical networks are used to solve sampling problems expected to be hard for classical computers. Since these devices are not expected to be universal for quantum computation, it remains an open question whether any error-correction techniques can be applied to them, and thus it is important to investigate how robust the model is under natural experimental imperfections, such as losses and imperfect control of parameters. Here we investigate the complexity of BosonSampling under photon losses---more specifically, the case where an unknown subset of the photons is randomly lost at the sources. We show that, if $k$ out of $n$ photons are lost, then we cannot sample classically from a distribution that is $1/n^{\Theta(k)}$-close (in total variation distance) to the ideal distribution, unless a $\text{BPP}^{\text{NP}}$ machine can estimate the permanents of Gaussian matrices in $n^{O(k)}$ time. In particular, if $k$ is constant, this implies that simulating lossy BosonSampling is hard for a classical computer, under exactly the same complexity assumption used for the original lossless case.
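The classical hardness in the abstract rests on the difficulty of computing matrix permanents, which govern BosonSampling amplitudes. As an illustrative sketch (not from the paper), the fastest known exact method, Ryser's formula, still takes time on the order of $n \cdot 2^n$:

```python
from itertools import combinations

def permanent(A):
    """Exact permanent of a square matrix A (list of lists) via Ryser's
    formula: perm(A) = (-1)^n * sum over non-empty column subsets S of
    (-1)^{|S|} * prod_i sum_{j in S} A[i][j]."""
    n = len(A)
    total = 0.0
    for r in range(1, n + 1):  # the empty subset contributes 0 for n >= 1
        for S in combinations(range(n), r):
            prod = 1.0
            for i in range(n):
                prod *= sum(A[i][j] for j in S)
            total += (-1) ** r * prod
    return (-1) ** n * total

# Example: perm([[1, 2], [3, 4]]) = 1*4 + 2*3 = 10
print(permanent([[1.0, 2.0], [3.0, 4.0]]))
```

The exponential cost of even this best-known exact algorithm is what makes approximating Gaussian permanents in $n^{O(k)}$ time the relevant hardness benchmark in the result above.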

Citations (69)
