Heisenberg-limited Bayesian phase estimation with low-depth digital quantum circuits

Published 8 Jul 2024 in quant-ph (arXiv:2407.06006v1)

Abstract: Optimal phase estimation protocols require complex state preparation and readout schemes, generally unavailable or unscalable in many quantum platforms. We develop and analyze a scheme that achieves near-optimal precision up to a constant overhead for Bayesian phase estimation, using simple digital quantum circuits with depths scaling logarithmically with the number of qubits. We find that for Gaussian prior phase distributions with arbitrary widths, the optimal initial state can be approximated with products of Greenberger-Horne-Zeilinger (GHZ) states with varying numbers of qubits. Using local, adaptive measurements optimized for the prior distribution and the initial state, we show that Heisenberg scaling is achievable and that the proposed scheme outperforms known schemes in the literature that utilize a similar set of initial states. For an example prior width, we present a detailed comparison and find that it is also possible to achieve Heisenberg scaling with a scheme that employs non-adaptive measurements, given the right allocation of copies per GHZ state and single-qubit rotations. We also propose an efficient phase unwinding protocol to extend the dynamic range of the proposed scheme, and show that it outperforms existing protocols by achieving an enhanced precision with a smaller number of additional atoms. Lastly, we discuss the impact of noise and imperfect gates.
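The abstract describes Bayesian phase estimation in which a Gaussian prior is updated using parity measurements on GHZ probes of different sizes. The sketch below is an illustrative toy model of that idea, not the paper's protocol: it assumes an ideal N-qubit GHZ likelihood P(m | φ) = [1 + (-1)^m cos(Nφ + θ)]/2, a hand-picked allocation of GHZ sizes and copies, and a simple adaptive rule for the measurement phase θ; all numerical values are placeholders.

```python
# Illustrative sketch (assumptions noted above): grid-based Bayesian update of a
# Gaussian prior on a phase phi, using simulated parity measurements of N-qubit
# GHZ states with an ideal fringe likelihood.
import numpy as np

rng = np.random.default_rng(0)

# Discretized phase grid and Gaussian prior (width and true phase are assumptions).
phi = np.linspace(-np.pi, np.pi, 4001)
sigma0, phi_true = 0.3, 0.1
prior = np.exp(-0.5 * (phi / sigma0) ** 2)
prior /= np.trapz(prior, phi)

def likelihood(m, n_qubits, theta):
    """Parity-measurement likelihood for an ideal N-qubit GHZ probe."""
    return 0.5 * (1.0 + (-1) ** m * np.cos(n_qubits * phi + theta))

posterior = prior.copy()
for n_qubits in [1, 2, 4, 8]:          # GHZ sizes; allocation is illustrative
    for _ in range(10):                # copies per GHZ size (also illustrative)
        # Simple adaptive rule: place the fringe's steepest point at the
        # current mean estimate.
        phi_est = np.trapz(phi * posterior, phi)
        theta = np.pi / 2 - n_qubits * phi_est
        # Simulate a measurement outcome from the true phase, then update.
        p0 = 0.5 * (1.0 + np.cos(n_qubits * phi_true + theta))
        m = 0 if rng.random() < p0 else 1
        posterior *= likelihood(m, n_qubits, theta)
        posterior /= np.trapz(posterior, phi)

mean = np.trapz(phi * posterior, phi)
std = np.sqrt(np.trapz((phi - mean) ** 2 * posterior, phi))
print(f"estimate: {mean:.4f}, posterior std: {std:.4f}")
```

In this toy model, larger GHZ states sharpen the likelihood fringes (period 2π/N), which is the source of the sub-standard-quantum-limit scaling discussed in the paper; the actual scheme additionally optimizes the initial-state structure and the measurement settings for the prior.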

Citations (3)
