
Study of noise in virtual distillation circuits for quantum error mitigation (2210.15317v2)

Published 27 Oct 2022 in quant-ph

Abstract: Virtual distillation has been proposed as an error mitigation protocol for estimating the expectation values of observables in quantum algorithms. It proceeds by creating a cyclic permutation of $M$ noisy copies of a quantum state using a sequence of controlled-swap gates. If the noise does not shift the dominant eigenvector of the density operator away from the ideal state, then the error in expectation-value estimation can be exponentially reduced with $M$. In practice, subsequent error mitigation techniques are required to suppress the effect of noise in the cyclic permutation circuit itself, leading to increased experimental complexity. Here, we perform a careful analysis of the effect of uncorrelated, identical noise in the cyclic permutation circuit and find that the estimation of expectation values of observables is robust against dephasing noise. We support the analytical result with numerical simulations and find that errors are reduced by $67\%$ for $M=2$, with physical dephasing error probabilities as high as $10\%$. Our results imply that a broad class of quantum algorithms can be implemented with higher accuracy in the near-term with qubit platforms where non-dephasing errors are suppressed, such as superconducting bosonic qubits and Rydberg atoms.
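The core idea of virtual distillation is that $M$ noisy copies of a state $\rho$ let one estimate $\mathrm{Tr}(O\rho^M)/\mathrm{Tr}(\rho^M)$, which suppresses the contribution of subdominant eigenvectors. The following is a minimal density-matrix sketch of that estimator for $M=2$, simulated directly rather than via the controlled-swap circuit the paper analyzes; the single-qubit state, depolarizing error rate, and observable are illustrative choices, not taken from the paper:

```python
import numpy as np

# Ideal state |+> and a depolarized noisy copy of it
# (illustrative single-qubit example; p = 0.2 is an arbitrary choice).
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho_ideal = np.outer(plus, plus.conj())
p = 0.2
rho_noisy = (1 - p) * rho_ideal + p * np.eye(2) / 2

# Observable X, whose ideal expectation value in |+> is 1.
O = np.array([[0, 1], [1, 0]], dtype=complex)

# Raw (unmitigated) estimate: Tr(O rho)
raw = np.trace(O @ rho_noisy).real

# Virtually distilled estimate with M = 2: Tr(O rho^2) / Tr(rho^2)
M = 2
rho_M = np.linalg.matrix_power(rho_noisy, M)
mitigated = (np.trace(O @ rho_M) / np.trace(rho_M)).real

print(raw, mitigated)  # the mitigated value lies closer to the ideal 1.0
```

Because squaring the density matrix amplifies the dominant eigenvalue relative to the noise-induced ones, the mitigated estimate approaches the ideal value quadratically in the noise strength, which is the exponential suppression in $M$ described in the abstract.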

Citations (4)

