Experimental quantum reservoir computing with a circuit quantum electrodynamics system (2506.22016v1)

Published 27 Jun 2025 in quant-ph

Abstract: Quantum reservoir computing is a machine learning framework that offers ease of training compared to other quantum neural network models, as it does not rely on gradient-based optimization. Learning is performed in a single step on the output features measured from the quantum system. Various implementations of quantum reservoir computing have been explored in simulations, with different measured features. Although simulations have shown that quantum reservoirs present performance advantages over classical reservoirs, experimental implementations have remained scarce. This is due to the challenge of obtaining a large number of output features that are nonlinear transformations of the input data. In this work, we propose and experimentally implement a novel quantum reservoir computing platform based on a circuit quantum electrodynamics architecture, consisting of a single cavity mode coupled to a superconducting qubit. We obtain a large number of nonlinear features from a single physical system by encoding the input data in the amplitude of a coherent drive and measuring the cavity state in the Fock basis. We demonstrate classification on two classical tasks using significantly fewer hardware resources and fewer measured features than classical neural networks. Our experimental results are supported by numerical simulations that show that additional Kerr nonlinearity is beneficial to reservoir performance. Our work demonstrates a hardware-efficient quantum neural network implementation that can be further scaled up and generalized to other quantum machine learning models.
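The abstract's central training claim, that learning happens "in a single step" on measured output features, amounts to fitting a linear readout (e.g., ridge regression) on the feature vectors rather than running gradient-based optimization. The minimal Python sketch below illustrates that idea only. Everything quantum is mocked: `measure_fock_features`, its toy nonlinearity, and the classification task are illustrative assumptions standing in for the hardware measurements described in the paper, not the authors' actual model or code.

```python
# Sketch of single-step reservoir-computing readout training.
# The "reservoir" is faked: in the experiment, the input is encoded in a
# coherent-drive amplitude and the cavity is measured in the Fock basis;
# here a toy Poisson-like map stands in for those measured probabilities.
import numpy as np
from math import factorial

rng = np.random.default_rng(0)

def measure_fock_features(x, n_fock=16):
    """Hypothetical stand-in for the experiment: map a scalar input to
    Fock-occupation probabilities of a coherent-like state whose mean
    photon number depends nonlinearly on x (assumed toy nonlinearity)."""
    nbar = 1.0 + 3.0 * np.tanh(x) ** 2
    return np.array([np.exp(-nbar) * nbar**k / factorial(k)
                     for k in range(n_fock)])

# Toy binary classification task with a nonlinear decision rule.
X = rng.uniform(-2.0, 2.0, size=200)
y = (np.abs(X) > 1.0).astype(float)

# One "experiment" per input: collect the measured feature vectors.
F = np.stack([measure_fock_features(x) for x in X])   # shape (200, 16)

# Single-step learning: closed-form ridge regression on the features,
# W = (F^T F + lam*I)^(-1) F^T y -- no gradient descent involved.
lam = 1e-3
W = np.linalg.solve(F.T @ F + lam * np.eye(F.shape[1]), F.T @ y)

preds = (F @ W > 0.5).astype(float)
print("train accuracy:", (preds == y).mean())
```

Because only the linear readout W is trained, the entire "learning" step is one linear solve over the measured features; the nonlinearity the classifier needs is supplied by the reservoir (here, the toy Fock-distribution map).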
