Quantum Circuit Distillation and Compression (2309.01911v1)

Published 5 Sep 2023 in quant-ph

Abstract: Quantum coherence in a qubit is vulnerable to environmental noise. When a long quantum calculation is run on a quantum processor without error correction, the noise often causes fatal errors and corrupts the calculation. Here, we propose quantum-circuit distillation to generate quantum circuits that are short but retain enough functionality to produce an output almost identical to that of the original circuits. The distilled circuits are less sensitive to the noise and can complete the calculation before quantum coherence is lost in the qubits. We created a quantum-circuit distillator by building a reinforcement learning model, and applied it to the inverse quantum Fourier transform (IQFT) and Shor's quantum prime factorization. The obtained distilled circuits allow correct calculation on IBM-Quantum processors. By working with the quantum-circuit distillator, we also found a general rule for generating quantum circuits approximating the general $n$-qubit IQFTs. The quantum-circuit distillator offers a new approach to improving the performance of noisy quantum processors.
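
The sketch below is not the paper's reinforcement-learning distillator; it only illustrates, under assumptions, the kind of trade-off the abstract describes: a shortened circuit that approximates the original's output. It builds an exact n-qubit IQFT and a truncated one (Qiskit's standard `approximation_degree` option, which drops the smallest controlled-phase rotations) and compares their gate counts and output-state fidelity. The qubit count and input state are arbitrary choices for the demonstration.

```python
# Minimal sketch (not the paper's RL-based distillation): compare an exact
# inverse QFT with a shorter approximate one and check how closely the
# outputs agree. Requires Qiskit.
from qiskit import QuantumCircuit
from qiskit.circuit.library import QFT
from qiskit.quantum_info import Statevector, state_fidelity

n = 5  # number of qubits (arbitrary for this demo)

# Exact n-qubit inverse QFT, and a shortened version that omits the
# smallest controlled-phase rotations (standard approximate-QFT rule,
# used here only as a stand-in for a distilled circuit).
exact_iqft = QFT(n, inverse=True, do_swaps=True)
approx_iqft = QFT(n, inverse=True, do_swaps=True, approximation_degree=2)

# Fixed (but otherwise arbitrary) input state so the circuits can be compared.
prep = QuantumCircuit(n)
for q in range(n):
    prep.h(q)
    prep.p(0.3 * (q + 1), q)

def output_state(iqft):
    """Return the statevector after applying the given IQFT to the prepared input."""
    return Statevector.from_instruction(prep.compose(iqft))

fid = state_fidelity(output_state(exact_iqft), output_state(approx_iqft))

# Compare circuit sizes after decomposing the library blocks into basic gates.
print("exact  depth/gates:", exact_iqft.decompose().depth(), exact_iqft.decompose().size())
print("approx depth/gates:", approx_iqft.decompose().depth(), approx_iqft.decompose().size())
print("output-state fidelity:", round(fid, 4))
```

On a noisy processor, the shorter circuit's reduced depth is what makes it more likely to finish before decoherence sets in; the fidelity printed above quantifies how much accuracy is traded away by the truncation in this particular (assumed) approximation scheme.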

Citations (4)
