
GSQAS: Graph Self-supervised Quantum Architecture Search (2303.12381v1)

Published 22 Mar 2023 in quant-ph

Abstract: Quantum Architecture Search (QAS) is a promising approach to designing quantum circuits for variational quantum algorithms (VQAs). However, existing QAS algorithms require evaluating a large number of quantum circuits during the search process, which makes them computationally demanding and limits their application to large-scale quantum circuits. Recently, predictor-based QAS has been proposed to alleviate this problem by directly estimating the performance of circuits from their structures with a predictor trained on a set of labeled quantum circuits. However, the predictor is trained purely by supervised learning, which generalizes poorly when labeled training circuits are scarce. Obtaining a large number of labeled quantum circuits is very time-consuming, because the gate parameters of each circuit must be optimized until convergence to obtain its ground-truth performance. To overcome these limitations, we propose GSQAS, a graph self-supervised QAS, which trains a predictor based on self-supervised learning. Specifically, we first pre-train a graph encoder on a large number of unlabeled quantum circuits using a well-designed pretext task in order to generate meaningful representations of circuits. The downstream predictor is then trained on a small number of circuit representations and their labels. Once the encoder is trained, it can be applied to different downstream tasks. To better encode spatial topology and avoid the huge feature-vector dimensions of large-scale quantum circuits, we design a scheme to encode quantum circuits as graphs. Simulation results on searching circuit structures for the variational quantum eigensolver and quantum state classification show that GSQAS outperforms state-of-the-art predictor-based QAS, achieving better performance with fewer labeled circuits.
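
Below is a minimal, illustrative sketch of the pipeline the abstract describes: circuits are encoded as graphs, a graph encoder is pre-trained with a self-supervised pretext task on unlabeled circuits, and a small downstream predictor is then trained on the resulting embeddings. The gate vocabulary, the gate-as-node/wire-as-edge encoding, the masked-gate-prediction pretext task, the toy two-layer GCN, and all names (circuit_to_graph, GraphEncoder, pretrain_step) are assumptions for illustration, not the authors' actual encoding scheme or architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Assumed gate vocabulary; the paper's actual node encoding may differ.
NUM_GATE_TYPES = 6  # e.g. RX, RY, RZ, CNOT-control, CNOT-target, identity

def circuit_to_graph(gate_types, edges):
    """Encode a circuit as node features plus a normalized adjacency matrix.
    gate_types: list of gate-type ids, one per gate node.
    edges: (src, dst) pairs linking consecutive gates on the same qubit wire."""
    n = len(gate_types)
    x = F.one_hot(torch.tensor(gate_types), NUM_GATE_TYPES).float()
    a = torch.eye(n)                       # self-loops
    for s, d in edges:
        a[s, d] = a[d, s] = 1.0
    d_inv_sqrt = torch.diag(a.sum(1).pow(-0.5))
    return x, d_inv_sqrt @ a @ d_inv_sqrt  # symmetrically normalized adjacency

class GraphEncoder(nn.Module):
    """Toy two-layer graph convolutional encoder producing per-node embeddings."""
    def __init__(self, in_dim, hid_dim=32):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hid_dim)
        self.w2 = nn.Linear(hid_dim, hid_dim)

    def forward(self, x, a_norm):
        h = F.relu(a_norm @ self.w1(x))
        return a_norm @ self.w2(h)

# Assumed pretext task: mask a fraction of gate nodes and predict their gate
# types from the encoder's node embeddings (no circuit labels needed).
encoder = GraphEncoder(NUM_GATE_TYPES)
pretext_head = nn.Linear(32, NUM_GATE_TYPES)
opt = torch.optim.Adam(list(encoder.parameters()) + list(pretext_head.parameters()), lr=1e-3)

def pretrain_step(gate_types, edges, mask_ratio=0.15):
    x, a_norm = circuit_to_graph(gate_types, edges)
    mask = torch.rand(len(gate_types)) < mask_ratio
    if not mask.any():
        mask[0] = True                     # always mask at least one node
    x_masked = x.clone()
    x_masked[mask] = 0.0
    logits = pretext_head(encoder(x_masked, a_norm))
    loss = F.cross_entropy(logits[mask], torch.tensor(gate_types)[mask])
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

# Downstream: freeze the encoder and train a small regressor on mean-pooled
# circuit embeddings of a few labeled circuits (labels = converged VQA performance).
predictor = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 1))

def circuit_embedding(gate_types, edges):
    with torch.no_grad():
        x, a_norm = circuit_to_graph(gate_types, edges)
        return encoder(x, a_norm).mean(0)  # mean-pool node embeddings
```

The sketch only shows the general shape of the approach: self-supervised pre-training on many unlabeled circuit graphs, followed by a lightweight predictor fit on a small labeled set; GSQAS's specific pretext task and graph encoding are designed for quantum circuits and are described in the paper.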

Citations (15)

