Quantum-Based Self-Attention Mechanism for Hardware-Aware Differentiable Quantum Architecture Search (2512.02476v1)

Published 2 Dec 2025 in quant-ph

Abstract: The automated design of parameterized quantum circuits for variational algorithms in the NISQ era faces a fundamental limitation: conventional differentiable architecture search relies on classical models that fail to adequately represent quantum gate interactions under hardware noise. We introduce Quantum-Based Self-Attention for Differentiable Quantum Architecture Search (QBSA-DQAS), a meta-learning framework featuring quantum-based self-attention and hardware-aware multi-objective search. The framework employs a two-stage quantum self-attention module that computes contextual dependencies by mapping architectural parameters through parameterized quantum circuits, replacing classical similarity metrics with quantum-derived attention scores, and then applies position-wise quantum transformations for feature enrichment. Architecture search is guided by a task-agnostic multi-objective function that jointly optimizes noisy expressibility and the Probability of Successful Trials (PST). A post-search optimization stage applies gate commutation, fusion, and elimination to reduce circuit complexity. Experimental validation demonstrates superior performance on variational quantum eigensolver (VQE) tasks and large-scale Wireless Sensor Network (WSN) routing problems. For VQE on H$_2$, QBSA-DQAS achieves 0.9 accuracy compared to 0.89 for standard DQAS. Post-search optimization reduces the complexity of discovered circuits by up to 44% in gate count and 47% in depth without accuracy degradation. The framework maintains robust performance across three molecules and five IBM quantum hardware noise models. For WSN routing, the discovered circuits achieve an 8.6% energy reduction versus QAOA and 40.7% versus classical greedy methods, establishing the effectiveness of quantum-native architecture search for NISQ applications.
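
The quantum self-attention mechanism is described only at a high level in the abstract. Below is a minimal NumPy sketch of the central idea, replacing classical dot-product similarity with a quantum-derived score: each feature vector is angle-encoded into a small product state via RY rotations, and attention weights are obtained from pairwise state fidelities. The encoding choice and the names `ry`, `encode`, and `qbsa_attention` are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of quantum-derived self-attention weights (not the paper's code).
# Assumption: features are angle-encoded into product states with RY rotations,
# and attention scores are pairwise state fidelities |<psi_i|psi_j>|^2.
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix (real-valued)."""
    c, s = np.cos(theta / 2.0), np.sin(theta / 2.0)
    return np.array([[c, -s], [s, c]])

def encode(features):
    """Angle-encode a feature vector into the product state RY(f_1)|0> x ... x RY(f_n)|0>."""
    state = np.array([1.0])
    for f in features:
        state = np.kron(state, ry(f) @ np.array([1.0, 0.0]))
    return state

def qbsa_attention(X):
    """Self-attention where similarity scores come from quantum state overlaps."""
    states = [encode(x) for x in X]
    n = len(states)
    # Fidelity between the (real) encoded states replaces the classical dot product.
    scores = np.array([[np.dot(states[i], states[j]) ** 2 for j in range(n)]
                       for i in range(n)])
    # Row-wise softmax turns fidelities into attention weights.
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X  # attention-mixed features

# Example: four candidate-architecture parameter vectors, three angles each.
X = np.random.default_rng(0).uniform(0.0, np.pi, size=(4, 3))
print(qbsa_attention(X))
```

In the full framework these scores would be produced by parameterized quantum circuits evaluated under hardware noise models rather than by classical statevector simulation; the sketch only illustrates how quantum state overlaps can stand in for classical similarity inside an attention layer.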
