
Quantum Recurrent Embedding Neural Network (2506.13185v1)

Published 16 Jun 2025 in quant-ph

Abstract: Quantum neural networks have emerged as promising quantum machine learning models, leveraging the properties of quantum systems and classical optimization to solve complex problems in physics and beyond. However, previous studies have demonstrated inevitable trainability issues that severely limit their capabilities in the large-scale regime. In this work, we propose a quantum recurrent embedding neural network (QRENN) inspired by fast-track information pathways in ResNet and general quantum circuit architectures in quantum information theory. By employing dynamical Lie algebras, we provide a rigorous proof of the trainability of QRENN circuits, demonstrating that this deep quantum neural network can avoid barren plateaus. Notably, the general QRENN architecture resists classical simulation as it encompasses powerful quantum circuits such as QSP, QSVT, and DQC1, which are widely believed to be classically intractable. Building on this theoretical foundation, we apply our QRENN to accurately classify quantum Hamiltonians and detect symmetry-protected topological phases, demonstrating its applicability in quantum supervised learning. Our results highlight the power of recurrent data embedding in quantum neural networks and the potential for scalable quantum supervised learning in predicting physical properties and solving complex problems.
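The abstract's central idea, recurrently re-embedding the input data at every layer of the circuit, is closely related to the well-known "data re-uploading" construction in quantum machine learning. The QRENN circuit itself is not specified in the abstract, so the following is a generic single-qubit data re-uploading sketch (plain NumPy statevector simulation, not the authors' architecture) showing the mechanism: alternating a data-encoding rotation with a trainable rotation, so that deeper circuits can express higher-frequency functions of the input.

```python
import numpy as np

# Single-qubit rotation gates about the X and Z axes
def rx(a):
    return np.array([[np.cos(a / 2), -1j * np.sin(a / 2)],
                     [-1j * np.sin(a / 2), np.cos(a / 2)]])

def rz(a):
    return np.array([[np.exp(-1j * a / 2), 0],
                     [0, np.exp(1j * a / 2)]])

def reuploading_expectation(x, thetas):
    """<Z> after alternating data-encoding Rx(x) and trainable Rz(theta) layers.

    Re-embedding the input x in every layer (the recurrent-embedding idea)
    lets the circuit express higher-frequency functions of x than a single
    encoding layer could.
    """
    psi = np.array([1.0 + 0j, 0.0 + 0j])  # start in |0>
    for theta in thetas:
        psi = rz(theta) @ rx(x) @ psi      # embed data, then trainable rotation
    z = np.array([[1, 0], [0, -1]])
    return float(np.real(np.conj(psi) @ z @ psi))

rng = np.random.default_rng(0)
thetas = rng.uniform(0, 2 * np.pi, size=4)  # 4 recurrent embedding layers
print(reuploading_expectation(0.3, thetas))
```

With all trainable angles set to zero, the four `Rx(x)` layers compose into `Rx(4x)`, so the output is `cos(4x)`: the accessible frequency grows with the number of re-embedding layers, which is the expressive payoff of recurrent data embedding that the paper builds on.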
