Transpiling quantum circuits by a transformers-based algorithm

Published 10 Dec 2025 in quant-ph (arXiv:2512.09834v1)

Abstract: Transformers have gained popularity in machine learning through their application to natural language processing. They manipulate and process text efficiently, capturing long-range dependencies in the data and performing next-word prediction. Gate-based quantum computing, in turn, is based on controlling the register of qubits in the quantum hardware by applying a sequence of gates, a process that can be interpreted as a low-level textual programming language. We develop a transformer model capable of transpiling quantum circuits from the QASM standard to other native gate sets suited to a specific target quantum hardware, in our case the gate set of IonQ's trapped-ion quantum computers. We demonstrate the feasibility of the translation for circuits of up to five qubits, with a percentage of correctly transpiled target circuits equal to or greater than 99.98%. We also prove that, in the worst-case scenario, the complexity of the transformer model scales polynomially with the depth of the register and the length of the circuit, allowing models with a larger number of parameters to be trained efficiently on HPC infrastructures.
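
The abstract frames transpilation as sequence-to-sequence translation between two "low-level text" representations of a circuit. The sketch below is not the authors' model; it is a minimal illustration of that framing, assuming gate-instruction-level tokens, a standard PyTorch encoder-decoder Transformer, and a hypothetical QASM-to-native-gate training pair (the token vocabularies, the particular native-gate decomposition, and all hyperparameters are illustrative assumptions).

```python
# Minimal sketch: treat an OpenQASM circuit and an IonQ-style native-gate
# circuit as two token sequences and train a seq2seq Transformer to map one
# to the other via next-token prediction. All vocabularies, circuits, and
# hyperparameters are toy assumptions, not values from the paper.
import torch
import torch.nn as nn

# Toy vocabularies: one token per gate instruction (an assumption; the paper
# only states that circuits are handled as low-level text).
SRC_VOCAB = ["<pad>", "<bos>", "<eos>", "h q[0];", "cx q[0],q[1];", "x q[0];"]
TGT_VOCAB = ["<pad>", "<bos>", "<eos>", "gpi2(0.25) q[0];", "gpi(0.0) q[0];",
             "ms(0,0) q[0],q[1];"]
src_id = {t: i for i, t in enumerate(SRC_VOCAB)}
tgt_id = {t: i for i, t in enumerate(TGT_VOCAB)}

D_MODEL = 64


class TranspilerTransformer(nn.Module):
    def __init__(self):
        super().__init__()
        self.src_emb = nn.Embedding(len(SRC_VOCAB), D_MODEL)
        self.tgt_emb = nn.Embedding(len(TGT_VOCAB), D_MODEL)
        self.core = nn.Transformer(d_model=D_MODEL, nhead=4,
                                   num_encoder_layers=2, num_decoder_layers=2,
                                   batch_first=True)
        self.head = nn.Linear(D_MODEL, len(TGT_VOCAB))

    def forward(self, src, tgt):
        # Causal mask so the decoder only attends to earlier target tokens
        # (standard next-token prediction, as described in the abstract).
        mask = self.core.generate_square_subsequent_mask(tgt.size(1))
        out = self.core(self.src_emb(src), self.tgt_emb(tgt), tgt_mask=mask)
        return self.head(out)


# One illustrative (source, target) pair: a QASM Hadamard+CNOT fragment and a
# hypothetical rendering in trapped-ion native gates (GPi, GPi2, MS).
src = torch.tensor([[src_id["<bos>"], src_id["h q[0];"],
                     src_id["cx q[0],q[1];"], src_id["<eos>"]]])
tgt = torch.tensor([[tgt_id["<bos>"], tgt_id["gpi2(0.25) q[0];"],
                     tgt_id["ms(0,0) q[0],q[1];"], tgt_id["<eos>"]]])

model = TranspilerTransformer()
logits = model(src, tgt[:, :-1])  # teacher forcing: predict the shifted target
loss = nn.CrossEntropyLoss()(logits.reshape(-1, len(TGT_VOCAB)),
                             tgt[:, 1:].reshape(-1))
print(loss.item())
```

In this reading, "transpiled correctly" means the decoded token sequence parses as a valid native-gate circuit equivalent to the source; how the paper verifies equivalence is not specified in the abstract.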
