An Amplitude-Encoding-Based Classical-Quantum Transfer Learning framework: Outperforming Classical Methods in Image Recognition (2502.20184v1)

Published 27 Feb 2025 in quant-ph

Abstract: The classical-quantum transfer learning (CQTL) method is introduced to address the challenge of training large-scale, high-resolution image data on a limited number of qubits (ranging from tens to hundreds) in the current Noisy Intermediate-Scale Quantum (NISQ) era. Existing CQTL frameworks have demonstrated quantum advantages with a small number of parameters (around 50), but the performance of quantum neural networks is sensitive to the number of parameters, and larger-scale quantum circuits with more parameters remain largely unexplored. This paper proposes an amplitude-encoding-based classical-quantum transfer learning (AE-CQTL) framework, accompanied by an effective learning algorithm. The AE-CQTL framework multiplies the number of quantum circuit parameters by using a multi-layer ansatz. Based on the AE-CQTL framework, we design and implement two CQTL neural network models: the Transfer Learning Quantum Neural Network (TLQNN) and the Transfer Learning Quantum Convolutional Neural Network (TLQCNN). Both models significantly expand the parameter capacity of quantum circuits, raising the parameter scale from a few dozen to over one hundred. In cross-experiments with three benchmark datasets (MNIST, Fashion-MNIST, and CIFAR10) and three source models (ResNet18, ResNet50, and DenseNet121), TLQNN and TLQCNN exceed the benchmark classical classifier on multiple performance metrics, including accuracy, convergence, stability, and generalization capability. Our work contributes to advancing the application of classical-quantum transfer learning on larger-scale quantum devices in the future.
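To make the described pipeline concrete, below is a minimal sketch (not the authors' code) of an amplitude-encoding-based classical-quantum transfer-learning model, assuming PennyLane and PyTorch. The qubit count, ansatz choice (StronglyEntanglingLayers), layer depth, and the 512-to-256 feature reduction are illustrative assumptions; the paper's own ansatz and hyperparameters may differ.

    # Hypothetical AE-CQTL-style pipeline: frozen classical backbone -> amplitude
    # encoding -> multi-layer variational circuit -> classical output layer.
    import torch
    import torch.nn as nn
    import torchvision.models as models
    import pennylane as qml

    n_qubits = 8    # amplitude encoding: 2**8 = 256 features per sample
    n_layers = 6    # multi-layer ansatz -> 6 * 8 * 3 = 144 trainable circuit parameters
    dev = qml.device("default.qubit", wires=n_qubits)

    @qml.qnode(dev, interface="torch")
    def circuit(inputs, weights):
        # Encode 256 classical features into the amplitudes of 8 qubits.
        qml.AmplitudeEmbedding(inputs, wires=range(n_qubits), normalize=True, pad_with=0.0)
        # Multi-layer variational ansatz (stand-in for the paper's circuit).
        qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
        return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

    weight_shapes = {"weights": (n_layers, n_qubits, 3)}
    quantum_layer = qml.qnn.TorchLayer(circuit, weight_shapes)

    # Frozen classical source model (ResNet18 here) used as a feature extractor.
    backbone = models.resnet18(weights="IMAGENET1K_V1")
    backbone.fc = nn.Identity()
    for p in backbone.parameters():
        p.requires_grad = False

    model = nn.Sequential(
        backbone,                        # 3x224x224 image -> 512-dim features
        nn.Linear(512, 2 ** n_qubits),   # compress to 256 amplitudes
        quantum_layer,                   # quantum classifier head
        nn.Linear(n_qubits, 10),         # 10 classes (e.g. MNIST / CIFAR10)
    )

With this layer depth, the variational circuit alone carries 144 parameters, in line with the abstract's claim of moving from a few dozen to over one hundred circuit parameters.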
