
A Survey of Quantum Transformers: Architectures, Challenges and Outlooks (2504.03192v4)

Published 4 Apr 2025 in quant-ph

Abstract: Quantum Transformers integrate the representational power of classical Transformers with the computational advantages of quantum computing. Since 2022, research in this area has expanded rapidly, giving rise to diverse technical paradigms and early applications. To address the growing need for consolidation, this paper presents the first comprehensive, systematic, and in-depth survey of quantum Transformer models. First, we delineate the research scope, focusing on improving Transformer components with quantum methods, and introduce foundational concepts in classical Transformers and quantum machine learning. We then organize existing studies into two main paradigms, PQC-based and QLA-based, with the PQC-based paradigm further divided into QKV-only Quantum Mapping, Quantum Pairwise Attention, Quantum Holistic Attention, and Quantum-Assisted Optimization, analyzing their core mechanisms and architectural traits. We also summarize empirical results that demonstrate preliminary quantum advantages, especially on small-scale tasks or in resource-constrained settings. We then examine key technical challenges, such as complexity-resource trade-offs, scalability and generalization limitations, and trainability issues including barren plateaus, and outline potential solutions, including quantumizing classical Transformer variants with lower complexity, hybrid designs, and improved optimization strategies. Finally, we propose several promising future directions, e.g., scaling quantum modules into large architectures, applying quantum Transformers to domains with inherently quantum data (e.g., physics, chemistry), and developing theory-driven designs grounded in quantum information science. This survey will help researchers and practitioners quickly grasp the landscape of current quantum Transformer research and promote future developments in this emerging field.
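To make the "QKV-only Quantum Mapping" paradigm concrete, the sketch below simulates the general idea with plain NumPy: each token embedding is passed through a small parameterized quantum circuit (PQC), and the resulting per-qubit expectation values serve as the Q, K, and V features, while the attention computation itself stays classical. This is a minimal illustrative sketch, not code from the survey; the circuit layout (two qubits, angle encoding followed by one trainable RY layer) and all dimensions are assumptions chosen for brevity.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def pqc_features(x, params):
    """Angle-encode a 2-dim input on 2 qubits, apply a trainable RY
    layer, and return the per-qubit <Z> expectations as features."""
    state = np.array([1.0, 0.0, 0.0, 0.0])        # |00>
    u0 = ry(params[0]) @ ry(x[0])                  # qubit 0: encode + train
    u1 = ry(params[1]) @ ry(x[1])                  # qubit 1: encode + train
    state = np.kron(u0, u1) @ state
    p = np.abs(state) ** 2                         # basis order |q0 q1>
    z0 = p[0] + p[1] - p[2] - p[3]                 # <Z> on qubit 0
    z1 = p[0] - p[1] + p[2] - p[3]                 # <Z> on qubit 1
    return np.array([z0, z1])

def quantum_mapped_attention(X, pq, pk, pv):
    """Classical softmax attention over PQC-produced Q, K, V."""
    Q = np.stack([pqc_features(x, pq) for x in X])
    K = np.stack([pqc_features(x, pk) for x in X])
    V = np.stack([pqc_features(x, pv) for x in X])
    scores = Q @ K.T / np.sqrt(Q.shape[1])
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)              # softmax rows
    return w @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))                        # 4 tokens, 2-dim embeddings
out = quantum_mapped_attention(X, rng.normal(size=2),
                               rng.normal(size=2), rng.normal(size=2))
print(out.shape)                                   # (4, 2)
```

Because each feature is a Pauli-Z expectation, Q, K, and V entries are bounded in [-1, 1], and the output is a convex combination of V rows; the other PQC-based variants in the survey's taxonomy move progressively more of the attention computation itself onto the circuit.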
