
Quantum Architecture Search

Updated 21 September 2025
  • Quantum Architecture Search (QAS) is the automated discovery of parameterized quantum circuits that balance expressivity and hardware constraints for optimal quantum algorithm performance.
  • It integrates methodologies such as gradient-based optimization, supernet techniques, neural predictors, and evolutionary as well as reinforcement learning to efficiently navigate complex circuit design spaces.
  • QAS has diverse applications in quantum machine learning, chemistry, and combinatorial optimization, significantly reducing manual design effort while enhancing noise resilience on NISQ devices.

Quantum Architecture Search (QAS) refers to the automated discovery of parameterized quantum circuit architectures tailored to specific computational tasks, particularly in the context of variational quantum algorithms (VQAs) on noisy intermediate-scale quantum (NISQ) devices. Efficient QAS is indispensable because the expressivity, trainability, and hardware compatibility of a quantum circuit are highly sensitive to its architecture, impacting performance in quantum machine learning, simulation, optimization, and quantum chemistry. Manual ansatz design is labor-intensive and error-prone; automated QAS instead jointly optimizes circuit structure and gate parameters, often employing advanced machine learning, gradient-based optimization, or evolutionary strategies. This article surveys the theoretical foundations, methodological strategies, applications, and current research challenges associated with QAS, integrating insights from recent research advances.

1. Theoretical Foundations and Motivation

QAS is fundamentally motivated by the observation that the architecture of a parameterized quantum circuit—its connectivity, gate arrangement, and overall topology—strongly governs both its expressivity (the set of quantum states it can efficiently generate) and trainability (avoidance of barren plateaus and optimization feasibility) (Zhang et al., 2020, Du et al., 2020, Su et al., 20 Feb 2025). NISQ hardware restrictions, such as limited connectivity and prevalent noise, further impose stringent constraints and make hand-crafting effective ansätze both technically demanding and often suboptimal (Linghu et al., 2022, Kölle et al., 14 Sep 2025). QAS aims to overcome these challenges by automating architecture discovery, thus accelerating the identification of circuits exhibiting quantum or quantum-inspired advantage on resource-limited platforms.

Automated QAS seeks to:

  • jointly optimize circuit structure and gate parameters for a given computational task;
  • balance expressivity against trainability, steering the search away from barren plateaus;
  • respect hardware constraints such as native gate sets, qubit connectivity, and noise characteristics;
  • reduce the manual effort and expert intuition required to craft performant ansätze.

2. Principal QAS Methodologies

Multiple methodological paradigms have emerged for QAS, each leveraging distinct algorithmic strategies:

Differentiable (Gradient-Based) Approaches

Differentiable Quantum Architecture Search (DQAS) relaxes the discrete search over circuit structures into a continuous, differentiable domain, enabling joint optimization of gate choices (structure) and parameters. The discrete circuit layout at each layer is encoded probabilistically with trainable parameters $\alpha$, using softmax distributions over an operation pool. The full architecture search problem is then formulated as a bi-level optimization over structure ($\alpha$) and gate parameters ($\theta$), both updated with gradient techniques including REINFORCE estimators and parameter-shift rules:

  • Objective loss (averaged over architectures $k$ sampled from $P(k,\alpha)$):

\mathcal{L} = \sum_{k \sim P(k,\alpha)} L\big(U(k, \theta)\big)

  • Structural gradient (score function estimator):

\nabla_{\alpha} \mathcal{L} = \sum_{k \sim P(k,\alpha)} \nabla_{\alpha} \log P(k,\alpha) \cdot L\big(U(k, \theta)\big)

DQAS enables automated circuit design for unitary decomposition, noise-mitigated circuit completion, and QAOA ansatz search, showing significant gains in both circuit compactness and fidelity relative to manual designs (Zhang et al., 2020).
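
To make the coupled update concrete, below is a minimal NumPy sketch of a DQAS-style loop. The toy loss is a stand-in for evaluating $L(U(k,\theta))$ on a simulator or hardware, and a finite-difference probe stands in for the parameter-shift rule; all names and hyperparameters are illustrative assumptions rather than the reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

N_LAYERS, POOL = 3, 4                      # circuit layers, operation-pool size
alpha = np.zeros((N_LAYERS, POOL))         # structural logits (trainable)
theta = rng.normal(size=(N_LAYERS, POOL))  # gate parameters, one per (layer, op)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def loss(arch, theta):
    # Toy stand-in for L(U(k, theta)): pretend operation 2 with its angle
    # near pi/2 is optimal in every layer.
    sel = theta[np.arange(N_LAYERS), arch]
    return float(np.sum((sel - np.pi / 2) ** 2) + np.sum(arch != 2))

for step in range(300):
    probs = softmax(alpha)
    g_alpha = np.zeros_like(alpha)
    g_theta = np.zeros_like(theta)
    batch = 16
    for _ in range(batch):
        arch = np.array([rng.choice(POOL, p=p) for p in probs])  # k ~ P(k, alpha)
        L = loss(arch, theta)
        for l in range(N_LAYERS):
            # score-function (REINFORCE) estimator: grad_alpha log P(k,alpha) * L
            g_alpha[l] += (np.eye(POOL)[arch[l]] - probs[l]) * L / batch
            # finite-difference probe standing in for the parameter-shift rule
            eps = 1e-4
            theta[l, arch[l]] += eps
            g_theta[l, arch[l]] += (loss(arch, theta) - L) / eps / batch
            theta[l, arch[l]] -= eps
    alpha -= 0.1 * g_alpha
    theta -= 0.05 * g_theta

print("most likely ops per layer:", softmax(alpha).argmax(axis=1))  # expect [2 2 2]
```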

Supernet and Weight Sharing Techniques

QAS frameworks such as those in (Du et al., 2020, Linghu et al., 2022) employ a "supernet" that encodes a large ansatz pool. Weight sharing is enforced so that different architectures re-use parameters when layer/gate layouts coincide, reducing the parameter optimization burden. The supernet simultaneously supports the fast sampling, evaluation, and ranking of candidate circuits. Ranking can be subsequently enhanced using evolutionary processes (e.g., NSGA-II) or heuristic bandit approaches.
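
The following is a minimal sketch of weight sharing under stated assumptions: a dictionary keyed by (layer, gate choice) acts as the supernet's shared parameter store, and evaluate() is a toy stand-in for running a candidate circuit on a simulator or device.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
N_LAYERS, POOL = 2, 3

# One shared parameter per (layer, gate choice): any two sampled
# architectures that pick the same gate in the same layer reuse it.
shared = {(l, g): rng.normal() for l in range(N_LAYERS) for g in range(POOL)}

def evaluate(arch):
    # Toy loss standing in for running the candidate circuit.
    params = np.array([shared[(l, g)] for l, g in enumerate(arch)])
    return float(np.sum((params - 1.0) ** 2))

# Train the supernet: each step updates only the parameters touched by one
# sampled sub-architecture, so all candidates are trained jointly.
for _ in range(2000):
    arch = tuple(rng.integers(POOL, size=N_LAYERS))
    for l, g in enumerate(arch):
        eps = 1e-4
        base = evaluate(arch)
        shared[(l, g)] += eps
        grad = (evaluate(arch) - base) / eps
        shared[(l, g)] -= eps + 0.05 * grad   # undo probe, apply gradient step

# Rank every candidate cheaply with the shared weights, with no retraining;
# top candidates could then be refined by evolutionary or bandit methods.
ranking = sorted(product(range(POOL), repeat=N_LAYERS), key=evaluate)
print("top-3 architectures:", ranking[:3])
```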

Neural Predictor-Based Approaches

Predictor-based QAS leverages neural networks (CNNs, RNNs, GNNs) to estimate the performance of candidate architectures from their circuit structural encodings (e.g., tensor representations or DAGs), thus pruning unpromising candidates without exhaustive quantum-classical simulation. Predictors are trained either in a supervised way (requiring labeled, fully trained circuits) (Zhang et al., 2021), or via self-supervised and unsupervised representation learning schemes (He et al., 2023, Sun et al., 21 Jan 2024). This predictor-driven filtering drastically reduces the number of costly circuit evaluations (e.g., order-of-magnitude speedups) and can be enhanced by graph encoding schemes that better preserve circuit topology (He et al., 2023).
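
A minimal sketch of the predictor-based workflow, assuming a ridge-regression predictor over one-hot circuit encodings in place of the CNN/RNN/GNN predictors used in the literature; measured_score() is a synthetic stand-in for the expensive labeling step.

```python
import numpy as np

rng = np.random.default_rng(2)
N_LAYERS, POOL = 4, 5

def encode(arch):
    # Flattened one-hot tensor encoding of a layerwise gate layout.
    x = np.zeros((N_LAYERS, POOL))
    x[np.arange(N_LAYERS), arch] = 1.0
    return x.ravel()

def measured_score(arch):
    # Synthetic stand-in for the expensive step: fully training the
    # candidate circuit and measuring its task performance.
    return float(np.sum(arch == 1)) + 0.1 * rng.normal()

# Label a small pool of architectures (the costly part in practice).
train = [rng.integers(POOL, size=N_LAYERS) for _ in range(60)]
X = np.stack([encode(a) for a in train])
y = np.array([measured_score(a) for a in train])

# Ridge-regression predictor fit in closed form.
w = np.linalg.solve(X.T @ X + 1e-3 * np.eye(X.shape[1]), X.T @ y)

# Screen a large candidate pool with zero quantum evaluations.
candidates = [rng.integers(POOL, size=N_LAYERS) for _ in range(5000)]
best = max(candidates, key=lambda a: float(encode(a) @ w))
print("predicted-best architecture:", best)   # should favor gate 1 everywhere
```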

Evolutionary Algorithms

Evolutionary-based QAS, such as EQAS (Zhang et al., 2022) and QuProFS (Gujju et al., 9 Aug 2025), encodes each quantum circuit architecture as a genome (binary or integer string) representing gates, positions, and inclusions. Evolutionary processes combine redundancy removal (using QFIM eigenanalysis), fitness-based selection (often integrating accuracy and simplicity), and stochastic genetic operators (mutation, crossover). Proxy metrics such as expressivity (KL divergence to Haar), local effective dimension for trainability, kernel alignment, and hardware robustness guide selection and aggregation, allowing for rapid, training-free evolution of robust circuits.
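
Below is a minimal evolutionary-QAS sketch under simplifying assumptions: the genome is an integer string over four gate choices, and fitness() is a toy proxy combining a task term with a simplicity penalty, standing in for the expressivity, trainability, and hardware-robustness proxies described above.

```python
import numpy as np

rng = np.random.default_rng(3)
GENOME_LEN, POP, GENS = 12, 30, 40   # gate slots, population size, generations

def fitness(genome):
    # Toy proxy: reward a "useful" gate type, penalize total gate count.
    score = float(np.sum(genome == 2))
    penalty = 0.3 * float(np.sum(genome != 0))   # gate 0 = unused slot
    return score - penalty

def mutate(genome, rate=0.1):
    g = genome.copy()
    mask = rng.random(GENOME_LEN) < rate
    g[mask] = rng.integers(4, size=int(mask.sum()))
    return g

def crossover(a, b):
    cut = int(rng.integers(1, GENOME_LEN))
    return np.concatenate([a[:cut], b[cut:]])

pop = [rng.integers(4, size=GENOME_LEN) for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    elite = pop[: POP // 2]                      # fitness-based selection
    children = []
    while len(children) < POP - len(elite):
        i, j = rng.integers(len(elite), size=2)
        children.append(mutate(crossover(elite[i], elite[j])))
    pop = elite + children

best = max(pop, key=fitness)
print("best genome:", best, "fitness:", round(fitness(best), 2))
```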

Reinforcement Learning and Curriculum RL

Reinforcement learning (RL)–based QAS treats circuit construction as a Markov Decision Process, where an agent sequentially selects gates to optimize a reward balancing task performance and circuit complexity (Kölle et al., 14 Sep 2025, Patel et al., 5 Feb 2024, Sadhu et al., 9 Apr 2024, Kundu et al., 25 Jun 2024, Dutta et al., 16 Jul 2025). Recently, curriculum RL has enabled more efficient exploration by progressively increasing task difficulty and circuit complexity. Specific frameworks combine RL with function approximation via either classical MLPs or Kolmogorov-Arnold Networks (KAN) (Kundu et al., 25 Jun 2024), or leverage quantum-enhanced RL agents operating with quantum tensor network representations (Dutta et al., 16 Jul 2025). Reward functions are domain-adapted, incorporating energetic, structural, or information-theoretic metrics. Realizations include double deep Q-networks, prioritized experience replay, and explicit curriculum regimes for staged task escalation.
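
As a concrete illustration, the sketch below casts sequential gate selection as a toy MDP solved with tabular Q-learning; the cited frameworks use double deep Q-networks, prioritized replay, and curriculum schedules instead, and reward() here is only an assumed stand-in for a domain-adapted reward balancing task performance against circuit complexity.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(4)
MAX_DEPTH, N_GATES = 5, 3          # episode length, action space (gate types)

def reward(circuit):
    # Terminal reward: toy task score minus a complexity penalty.
    task = float(np.sum(np.array(circuit) == 1))
    return task - 0.2 * len(circuit)

Q = defaultdict(float)
eps, lr, gamma = 0.2, 0.1, 0.95

for episode in range(3000):
    circuit = []
    for depth in range(MAX_DEPTH):
        state = tuple(circuit)
        if rng.random() < eps:                       # epsilon-greedy exploration
            a = int(rng.integers(N_GATES))
        else:
            a = max(range(N_GATES), key=lambda g: Q[(state, g)])
        circuit.append(a)
        done = depth == MAX_DEPTH - 1
        r = reward(circuit) if done else 0.0
        nxt = tuple(circuit)
        target = r if done else gamma * max(Q[(nxt, g)] for g in range(N_GATES))
        Q[(state, a)] += lr * (target - Q[(state, a)])

# Greedy rollout of the learned policy.
circuit = []
for _ in range(MAX_DEPTH):
    s = tuple(circuit)
    circuit.append(max(range(N_GATES), key=lambda g: Q[(s, g)]))
print("greedy circuit:", circuit)
```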

Topology-Driven Search

Recent results reveal that circuit topology frequently dominates over gate-type choices in determining circuit performance (Su et al., 20 Feb 2025). The Topology-Driven QAS (TD-QAS) framework decouples the search over topology ("placeholders" expressing circuit connectivity) from the search over gate types ("gate-tuning"), enabling efficient parameter inheritance and a dramatic reduction of the architecture search space.
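
A schematic sketch of the two-stage decoupling follows, with score() as an assumed toy objective in which the topology term dominates, mirroring the empirical observation that motivates TD-QAS.

```python
from itertools import product

N_SLOTS, GATES = 4, ("rx", "ry", "rz")

def score(active, gates):
    # Toy objective: topology term dominates the gate-type term.
    topo = -abs(sum(active) - 2) * 10.0   # best with exactly 2 active slots
    gate = sum(1 for a, g in zip(active, gates) if a and g == "ry")
    return topo + gate

# Stage 1 ("placeholders"): search connectivity with a fixed generic gate.
topologies = list(product([0, 1], repeat=N_SLOTS))
best_topo = max(topologies, key=lambda t: score(t, ("rx",) * N_SLOTS))

# Stage 2 ("gate-tuning"): assign gate types only within the chosen
# topology; parameters found in stage 1 would be inherited here.
best_gates = max(product(GATES, repeat=N_SLOTS),
                 key=lambda g: score(best_topo, g))

print("topology:", best_topo, "gates:", best_gates)
```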

Training-Free and Surrogate-Assisted Approaches

Training-free QAS approaches avoid full parameter optimization and predictive-model training entirely. For example, landscape fluctuation analysis quantifies the learnability of a circuit by computing normalized cost-function fluctuations over random parameter samples, serving as a predictor of trainability without any circuit optimization (Zhu et al., 8 May 2025). Other methods employ surrogate models (e.g., graph neural networks or random forests) trained on large datasets of circuit-performance pairs to support rapid architecture evaluation and benchmarking without incurring the underlying resource cost (Martyniuk et al., 7 Jun 2025).
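
A minimal sketch of the landscape-fluctuation criterion under stated assumptions: cost() is a synthetic landscape whose flatness parameter mimics a barren plateau, and the normalized fluctuation (standard deviation over mean of the cost across random parameter samples) serves as the training-free trainability proxy, with near-zero fluctuation signalling a plateau.

```python
import numpy as np

rng = np.random.default_rng(6)

def cost(theta, flatness):
    # Toy cost landscape: `flatness` damps variation to mimic a plateau.
    return 1.0 + np.exp(-flatness) * np.prod(np.cos(theta))

def landscape_fluctuation(cost_fn, n_params, n_samples=2000):
    thetas = rng.uniform(0, 2 * np.pi, size=(n_samples, n_params))
    vals = np.array([cost_fn(t) for t in thetas])
    return vals.std() / max(abs(vals.mean()), 1e-12)   # normalized fluctuation

for flat in (0.0, 5.0, 10.0):
    f = landscape_fluctuation(lambda t: cost(t, flat), n_params=6)
    print(f"flatness={flat:4.1f}  normalized fluctuation={f:.3e}")
```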

3. Application Domains and Empirical Benchmarks

QAS methods have been empirically validated across a range of NISQ-oriented quantum tasks, including:

  • quantum chemistry, e.g., variational quantum eigensolver (VQE) ground-state energy estimation;
  • combinatorial optimization, e.g., QAOA ansatz discovery;
  • quantum machine learning, e.g., variational classification tasks;
  • circuit synthesis tasks such as unitary decomposition and noise-mitigated circuit completion.

Key performance metrics across these domains include energy accuracy ($E/E_0$), classification accuracy, circuit complexity (gate count and depth), robustness to realistic hardware noise, and search runtime or sampling efficiency.

4. Architectural Encoding and Search Space Complexity

Effective QAS depends critically on representing the circuit search space both expressively and efficiently. Encodings in use include:

  • probabilistic encodings, i.e., trainable softmax distributions over an operation pool (as in DQAS);
  • tensor representations of layerwise gate layouts, commonly used as predictor inputs;
  • directed acyclic graphs (DAGs) that capture gate dependencies and circuit topology;
  • genome strings (binary or integer) encoding gate types, positions, and inclusion flags, as used in evolutionary QAS;
  • topology placeholders that separate connectivity from gate-type assignment (as in TD-QAS).

Scaling QAS beyond moderate qubit numbers remains a central concern. Recent advances include two-level search strategies (layerwise followed by pruning), dynamic screening (e.g., coarse-to-fine untrained filtering and trained focusing (Yu et al., 31 Oct 2024)), and use of classically tractable predictors that scale polynomially with system size (Zhu et al., 8 May 2025). Training-free, criterion-driven, or topology-first approaches show particular promise for large qubit-scale applications.

5. Challenges and Prospects

Several outstanding challenges and ongoing directions are identified across the QAS literature:

  • Search Space Explosion: Even with encoding optimizations and parameter sharing, the design space grows exponentially with qubit count and circuit depth. Decoupled or hierarchical search, surrogate-assisted filtering, and dynamic evolutionary strategies can mitigate, but not eliminate, this growth.
  • Barren Plateaus and Trainability: Ensuring that ansätze lie in trainable regions of parameter space is critical. Methods leveraging information-content metrics or landscape fluctuation analyses provide efficient, training-free mechanisms for early detection and avoidance of barren plateaus (Soloviev et al., 29 Jul 2024, Zhu et al., 8 May 2025).
  • Hardware Adaptivity: Adapting to hardware topologies, gate error rates, and connectivity remains essential. Incorporating native gate sets, local connectivity, and noise-awareness into QAS frameworks is now standard practice (Linghu et al., 2022, Gujju et al., 9 Aug 2025).
  • Benchmarking and Reproducibility: The lack of standardized benchmarks has hampered direct comparison across QAS methods. Recent works propose surrogate datasets and code frameworks (such as SQuASH) enabling reproducible, fair benchmarking and rapid prototyping (Martyniuk et al., 7 Jun 2025).
  • Transferability and Meta-Learning: Making QAS solutions transferable across tasks and exploiting meta-learned architectures can accelerate convergence and improve overall efficiency (He et al., 2021, He et al., 2023, Sun et al., 21 Jan 2024).

Future developments are likely to include tighter human-in-the-loop and closed-loop optimization, dynamic feedback from hardware runs, integration of more advanced meta-learning, and standardization of QAS benchmarks akin to classical neural architecture search.

6. Interdisciplinary Inspirations and Theoretical Impact

QAS methodologies draw from an interdisciplinary fusion of differentiable programming (continuous relaxations and gradient-based search), probabilistic programming (sampling and stochastic optimization), reinforcement learning (policy optimization in discrete/continuous action spaces), evolutionary computation (population-based search), and quantum information theory (using entanglement and Fisher information as metrics of expressibility and trainability).

The theoretical significance of QAS extends to understanding the complex interplay between circuit topology, gate selection, and noise, and has led to insights generalizable to quantum algorithm design, meta-circuit transfer, and the role of structure in quantum machine learning architectures (Zhang et al., 2020, Martyniuk et al., 10 Jun 2024, Su et al., 20 Feb 2025).

7. Future Directions and Outlook

QAS continues to evolve, with several frontiers detailed in the recent survey and research landscape (Martyniuk et al., 10 Jun 2024, Yu et al., 31 Oct 2024, Su et al., 20 Feb 2025, Zhu et al., 8 May 2025):

  • Distributed and Parallel QAS: Efficient search over distributed quantum processing units for scalable, large-scale circuit discovery.
  • Hardware-in-the-Loop and Online QAS: Integration of live hardware feedback, noise models, and adaptive search tailored to evolving device characteristics.
  • Benchmark Development: Comprehensive tabular and surrogate QAS benchmarks, standardized datasets for circuit performance, and code releases supporting open evaluation (Martyniuk et al., 7 Jun 2025).
  • Multi-Objective and Task-Driven QAS: Simultaneous optimization for noise resilience, circuit compactness, expressibility, and real task performance, beyond single-task settings.
  • Advances in Training-Free and Proxy-Guided Methods: Further development of landscape-driven and hardware-aware, proxy-based metrics for circuit quality and trainability to accelerate search in large, complex spaces (Zhu et al., 8 May 2025, Gujju et al., 9 Aug 2025).
  • Meta-Learning and Knowledge Transfer: Architecture and parameter initializations that generalize across problem instances for rapid adaptation (He et al., 2021, Su et al., 20 Feb 2025).

QAS is poised to become central to the realization of quantum advantage in near-term devices, both by systematically tailoring circuits to task and hardware, and by providing a bridge between theoretical properties, algorithm design, and experimental implementation.
