
Quantum Circuit Transformation: A Monte Carlo Tree Search Framework (2008.09331v4)

Published 21 Aug 2020 in quant-ph

Abstract: In the Noisy Intermediate-Scale Quantum (NISQ) era, quantum processing units (QPUs) suffer from, among other limitations, highly restricted connectivity between physical qubits. To make a quantum circuit effectively executable, a circuit transformation process is needed to convert it, with as small an overhead cost as possible, into a functionally equivalent circuit that satisfies the connectivity constraints imposed by the QPU. While several algorithms have been proposed for this goal, their overhead costs are often very high, which sharply degrades the fidelity of the resulting circuits. One major reason is that, due to the high branching factor and vast search space, almost all these algorithms search only very shallowly and thus frequently reach solutions that are at best locally optimal. In this paper, we propose a Monte Carlo Tree Search (MCTS) framework for the circuit transformation problem, which enables the search process to go much deeper. The general framework supports implementations that reduce either the size or the depth of the output circuit by introducing SWAP or remote CNOT gates. The resulting algorithms, called MCTS-Size and MCTS-Depth, are polynomial in all relevant parameters. Empirical results on extensive realistic circuits and IBM Q Tokyo show that the MCTS-based algorithms reduce the size (resp. depth) overhead by, on average, 66% (resp. 84%) compared with tket, an industrial-level compiler.
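To make the search loop concrete, below is a minimal, self-contained Python sketch of MCTS-guided SWAP insertion on a toy 4-qubit line coupling graph. It is not the paper's MCTS-Size or MCTS-Depth implementation: the coupling graph, the greedy "commit executable gates" policy, the random rollout, and the reward (penalising inserted SWAPs and unexecuted gates) are all simplifying assumptions made for illustration.

```python
import math
import random

# Toy 4-qubit line coupling graph (an assumption for illustration, not IBM Q Tokyo):
# physical qubit i is connected to physical qubit i+1.
COUPLING = [(0, 1), (1, 2), (2, 3)]
N_PHYS = 4

def connected(p, q):
    return (p, q) in COUPLING or (q, p) in COUPLING

def executable_prefix(circuit, mapping):
    """Number of leading CNOTs whose logical qubits sit on adjacent physical qubits."""
    done = 0
    for a, b in circuit:
        if connected(mapping[a], mapping[b]):
            done += 1
        else:
            break
    return done

def apply_swap(mapping, p, q):
    """Exchange the logical qubits currently placed on physical qubits p and q."""
    return {logical: q if phys == p else p if phys == q else phys
            for logical, phys in mapping.items()}

class Node:
    def __init__(self, mapping, remaining, parent=None, swap=None):
        self.mapping, self.remaining = mapping, remaining
        self.parent, self.swap = parent, swap
        self.children, self.visits, self.value = [], 0, 0.0

    def expand(self):
        # One child per candidate SWAP; gates that become executable are committed.
        for p, q in COUPLING:
            child = Node(apply_swap(self.mapping, p, q), self.remaining, self, (p, q))
            child.remaining = child.remaining[
                executable_prefix(child.remaining, child.mapping):]
            self.children.append(child)

    def ucb(self, c=1.4):
        if self.visits == 0:
            return float("inf")
        return (self.value / self.visits
                + c * math.sqrt(math.log(self.parent.visits) / self.visits))

def rollout(node, max_swaps=20):
    """Apply random SWAPs; the reward penalises SWAP count and unexecuted gates."""
    mapping, remaining, swaps = node.mapping, list(node.remaining), 0
    while remaining and swaps < max_swaps:
        p, q = random.choice(COUPLING)
        mapping = apply_swap(mapping, p, q)
        swaps += 1
        remaining = remaining[executable_prefix(remaining, mapping):]
    return -(swaps + 3 * len(remaining))

def mcts_route(circuit, mapping, iterations=200, max_total_swaps=50):
    """Pick SWAPs one at a time, each chosen by a short MCTS search."""
    remaining = circuit[executable_prefix(circuit, mapping):]
    swaps_inserted = []
    while remaining and len(swaps_inserted) < max_total_swaps:
        root = Node(mapping, remaining)
        for _ in range(iterations):
            node = root
            while node.children:                      # selection
                node = max(node.children, key=Node.ucb)
            if node.visits > 0 and node.remaining:    # expansion
                node.expand()
                node = node.children[0]
            reward = rollout(node)                    # simulation
            while node is not None:                   # backpropagation
                node.visits += 1
                node.value += reward
                node = node.parent
        best = max(root.children, key=lambda n: n.visits)
        swaps_inserted.append(best.swap)
        mapping, remaining = best.mapping, best.remaining
    return swaps_inserted

if __name__ == "__main__":
    # Logical CNOTs as (control, target) pairs; the initial mapping is the identity.
    circuit = [(0, 3), (1, 2), (0, 2)]
    print(mcts_route(circuit, {i: i for i in range(N_PHYS)}))
```

The sketch only shows the selection-expansion-simulation-backpropagation skeleton applied to qubit routing; the paper's algorithms additionally consider remote CNOT gates, depth-oriented objectives, and more informed simulation than the random rollout used here.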


