Temporal Information Processing on Noisy Quantum Computers (2001.09498v2)

Published 26 Jan 2020 in quant-ph, cs.SY, eess.SY, and stat.ML

Abstract: The combination of machine learning and quantum computing has emerged as a promising approach for addressing previously untenable problems. Reservoir computing is an efficient learning paradigm that utilizes nonlinear dynamical systems for temporal information processing, i.e., processing of input sequences to produce output sequences. Here we propose quantum reservoir computing that harnesses complex dissipative quantum dynamics. Our class of quantum reservoirs is universal, in that any nonlinear fading memory map can be approximated arbitrarily closely and uniformly over all inputs by a quantum reservoir from this class. We describe a subclass of the universal class that is readily implementable using quantum gates native to current noisy gate-model quantum computers. Proof-of-principle experiments on remotely accessed cloud-based superconducting quantum computers demonstrate that small and noisy quantum reservoirs can tackle high-order nonlinear temporal tasks. Our theoretical and experimental results pave the path for attractive temporal processing applications of near-term gate-model quantum computers of increasing fidelity but without quantum error correction, signifying the potential of these devices for wider applications including neural modeling, speech recognition and natural language processing, going beyond static classification and regression tasks.
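To make the reservoir computing paradigm referenced in the abstract concrete, here is a minimal classical sketch in the echo state network style: a fixed, randomly connected nonlinear dynamical system is driven by the input sequence, and only a linear readout is trained. This is not the paper's quantum reservoir construction; the reservoir size, spectral radius, ridge parameter, and the delayed cubic target task are all illustrative assumptions.

```python
import numpy as np

# Minimal classical reservoir-computing sketch (echo state network style).
# A fixed nonlinear dynamical system is driven by the input sequence; only
# the linear readout is trained. All parameter choices are illustrative.

rng = np.random.default_rng(0)

N = 100            # reservoir size (hypothetical choice)
T = 500            # length of the input sequence
washout = 50       # initial transient discarded before fitting the readout

# Fixed random reservoir weights; spectral radius < 1 encourages fading memory.
W_in = rng.uniform(-0.5, 0.5, size=(N, 1))
W = rng.normal(size=(N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

# Example temporal task: reproduce a delayed nonlinear function of the input.
u = rng.uniform(-1, 1, size=T)
y_target = np.concatenate(([0.0, 0.0], u[:-2] ** 3))  # cubic, 2-step delay

# Drive the reservoir and collect its states.
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in[:, 0] * u[t])
    states[t] = x

# Train the linear readout by ridge regression on the post-washout states.
X = states[washout:]
Y = y_target[washout:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ Y)

y_pred = X @ W_out
nmse = np.mean((y_pred - Y) ** 2) / np.var(Y)
print(f"normalized mean squared error: {nmse:.4f}")
```

In the paper's setting, the role of the fixed nonlinear dynamical system is played by dissipative quantum dynamics implemented with native gates on a noisy device, while the trained part remains a simple readout over measured observables.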

Citations (72)
