
Transfer learning in hybrid classical-quantum neural networks (1912.08278v2)

Published 17 Dec 2019 in quant-ph, cs.LG, and stat.ML

Abstract: We extend the concept of transfer learning, widely applied in modern machine learning algorithms, to the emerging context of hybrid neural networks composed of classical and quantum elements. We propose different implementations of hybrid transfer learning, but we focus mainly on the paradigm in which a pre-trained classical network is modified and augmented by a final variational quantum circuit. This approach is particularly attractive in the current era of intermediate-scale quantum technology since it allows to optimally pre-process high dimensional data (e.g., images) with any state-of-the-art classical network and to embed a select set of highly informative features into a quantum processor. We present several proof-of-concept examples of the convenient application of quantum transfer learning for image recognition and quantum state classification. We use the cross-platform software library PennyLane to experimentally test a high-resolution image classifier with two different quantum computers, respectively provided by IBM and Rigetti.

Citations (252)

Summary

  • The paper extends the concept of transfer learning to hybrid classical-quantum neural networks, proposing a framework suitable for the Noisy Intermediate-Scale Quantum (NISQ) era.
  • It categorizes hybrid transfer learning into four paradigms: Classical to Classical (CC), Classical to Quantum (CQ), Quantum to Classical (QC), and Quantum to Quantum (QQ), highlighting the practical potential of CQ for tasks like image classification.
  • The research offers a pragmatic approach for leveraging current quantum resources by combining them with classical machine learning, opening new avenues for quantum machine learning applications and theoretical understanding.

Transfer Learning in Hybrid Classical-Quantum Neural Networks

In the field of machine learning, transfer learning has established itself as a key methodology: it allows pre-trained models to be reused across different but related tasks, reducing computational burden and often improving performance. The paper "Transfer learning in hybrid classical-quantum neural networks" by Mari et al. extends this paradigm to the hybrid domain comprising both classical and quantum computing elements, offering a novel approach suited to the current landscape of quantum computing.

Overview of Hybrid Classical-Quantum Neural Networks

Hybrid neural networks leverage the strengths of both classical and quantum computation. A typical hybrid model combines classical neural networks with variational quantum circuits (VQCs). While classical networks are adept at large-scale data analysis and feature extraction, quantum components can potentially encode complex quantum data and exploit quantum parallelism.

In a hybrid neural network, a classical network is often used to pre-process the input data into intermediate features, which are then fed into a quantum circuit that can exploit representations potentially intractable for classical computers alone. The paper positions these hybrid architectures as particularly apt for the Noisy Intermediate-Scale Quantum (NISQ) era, where quantum hardware remains limited in qubit numbers and coherence times.
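
As a concrete illustration, the sketch below stacks a small classical pre-processing layer and a variational quantum circuit using PennyLane's PyTorch interface (the library used in the paper). The qubit count, input dimension, angle embedding, and entangling ansatz are illustrative assumptions, not the exact architecture reported by the authors.

```python
import pennylane as qml
import torch

n_qubits = 4  # assumed circuit width, chosen for illustration

dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def quantum_circuit(inputs, weights):
    # Encode the classical features into single-qubit rotations (angle embedding)
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    # Trainable entangling layers acting as the variational ansatz
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    # One expectation value per qubit becomes the layer's output vector
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

weight_shapes = {"weights": (3, n_qubits, 3)}  # 3 variational layers
quantum_layer = qml.qnn.TorchLayer(quantum_circuit, weight_shapes)

# Minimal hybrid model: classical pre-processing -> quantum layer -> classical read-out.
# The 16-dimensional input and 2-class head are hypothetical placeholders.
hybrid_model = torch.nn.Sequential(
    torch.nn.Linear(16, n_qubits),   # compress classical features to n_qubits values
    torch.nn.Tanh(),                 # keep embedding angles in a bounded range
    quantum_layer,                   # variational quantum circuit
    torch.nn.Linear(n_qubits, 2),    # classical head for a two-class task
)
```

Because the quantum layer is wrapped as a standard Torch module, the whole stack can be trained end to end with the usual PyTorch optimizers and loss functions.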

Transfer Learning: Classical to Quantum, and Quantum Variants

The research focuses on transfer learning within these hybrid systems. Transfer learning approaches in this framework are categorized into four distinct paradigms:

  1. Classical to Classical (CC): Direct application of well-understood classical transfer learning principles where a pre-trained classical model is partially retrained for a new task.
  2. Classical to Quantum (CQ): A pre-trained classical network processes the data and extracts features, which are subsequently used by a quantum network to solve a more specific task. This approach is especially beneficial for high-dimensional data, such as images, since the quantum component handles only a reduced set of features. The paper's experimental results, including high-resolution image classification runs on IBM and Rigetti QPUs, underscore the feasibility and potential of this method (a code sketch follows this list).
  3. Quantum to Classical (QC): Features extracted by a pre-trained quantum model are post-processed by a classical neural network. This paradigm is particularly promising when quantum information must be classified or otherwise handled in a classical context.
  4. Quantum to Quantum (QQ): Transfer of learned representations between quantum networks. Recognizing intermediate features produced by quantum layers can be crucial for speeding up training processes and efficiently utilizing quantum resources.
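
As a rough sketch of the CQ paradigm from item 2, the example below freezes a pre-trained classical image network and replaces its final layer with a "dressed" variational quantum circuit built in PennyLane. The choice of ResNet18, the 4-qubit width, and the two-class head are assumptions made for illustration; what the paradigm prescribes is the general pattern of a frozen classical feature extractor followed by a trainable quantum classifier.

```python
import pennylane as qml
import torch
import torchvision

n_qubits = 4  # illustrative width of the quantum classifier

dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def variational_classifier(inputs, weights):
    # Embed the compressed classical features as rotation angles
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    # Trainable entangling layers (6 layers assumed for illustration)
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

quantum_layer = qml.qnn.TorchLayer(variational_classifier, {"weights": (6, n_qubits)})

# Pre-trained classical backbone: freeze every parameter so that only the
# newly attached head is optimized during transfer learning.
backbone = torchvision.models.resnet18(weights="IMAGENET1K_V1")
for param in backbone.parameters():
    param.requires_grad = False

# Replace the final fully connected layer with a "dressed" quantum circuit:
# classical compression -> variational quantum circuit -> classical read-out.
backbone.fc = torch.nn.Sequential(
    torch.nn.Linear(512, n_qubits),  # compress the 512 ResNet18 features to n_qubits
    torch.nn.Tanh(),                 # bound the values before angle embedding
    quantum_layer,                   # trainable variational quantum circuit
    torch.nn.Linear(n_qubits, 2),    # hypothetical binary classification head
)
```

Only the dressed head carries trainable parameters, so the expensive image pre-processing is done once by the frozen classical network while the quantum circuit learns the task-specific decision boundary.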

Implications and Future Directions

The conceptualization and realization of hybrid transfer learning models not only demonstrate a promising merger of classical and quantum methods but also carry substantial implications for the future of quantum computing. They outline a pragmatic path forward, leveraging existing quantum resources while addressing their limitations through classical machine learning techniques.

From a theoretical perspective, these models open new avenues in understanding quantum feature spaces and their interactions with classical domains, possibly advancing paradigms such as quantum feature maps and embeddings. Practically, they pave the way towards commercial and operational quantum machine learning applications in areas like large-scale data classification, optimization, and more.

The paper concludes by highlighting the significant opportunities for further exploration of transfer learning in quantum contexts. Future work could explore optimizing these hybrid architectures for more specific tasks, improving quantum robustness and efficiency, and harnessing quantum entanglement and superposition to extract features that are classically inaccessible.

In summary, the extension of transfer learning into hybrid classical-quantum domains provides a robust toolset for the era of NISQ devices, capitalizing on the strengths of each computation paradigm and promoting the joint evolution of classical and quantum algorithms.