
Quantum-Train: Rethinking Hybrid Quantum-Classical Machine Learning in the Model Compression Perspective (2405.11304v2)

Published 18 May 2024 in quant-ph

Abstract: We introduce the Quantum-Train (QT) framework, a novel approach that integrates quantum computing with classical machine learning algorithms to address significant challenges in data encoding, model compression, and inference hardware requirements. At the cost of only a slight decrease in accuracy, QT achieves remarkable results by employing a quantum neural network alongside a classical mapping model, which significantly reduces the trainable parameter count from $M$ to $O(\mathrm{polylog}(M))$ during training. Our experiments demonstrate QT's effectiveness on classification tasks, offering insights into its potential to revolutionize machine learning by leveraging quantum computational advantages. This approach not only improves model efficiency but also reduces generalization error, showcasing QT's potential across various machine learning applications.
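To make the parameter-compression idea concrete, below is a minimal, illustrative sketch (not the authors' implementation) of how a classical model's $M$ weights can be generated from the measurement probabilities of an $n = \lceil \log_2 M \rceil$-qubit circuit passed through a small classical mapping model. The choice of a layered RY + CNOT-chain ansatz and a tiny MLP as the mapping model, as well as all function and variable names, are hypothetical assumptions made for this example.

```python
import numpy as np

# Illustrative sketch of the Quantum-Train (QT) idea, NOT the authors' code.
# Assumptions (hypothetical choices for this example): the quantum neural
# network is a layered RY + CNOT-chain circuit on n = ceil(log2 M) qubits,
# and the classical mapping model is a tiny MLP that maps each basis state
# and its measurement probability to one weight of the target classical model.

def ry(theta):
    """2x2 RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_single_qubit(state, gate, qubit, n):
    """Apply a 2x2 gate to one qubit of an n-qubit statevector."""
    state = state.reshape([2] * n)
    state = np.moveaxis(state, qubit, 0)
    state = np.tensordot(gate, state, axes=([1], [0]))
    state = np.moveaxis(state, 0, qubit)
    return state.reshape(-1)

def apply_cnot(state, control, target, n):
    """Apply CNOT(control -> target) to an n-qubit statevector."""
    state = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[control] = 1                      # act only on the control = 1 subspace
    axis = target - 1 if target > control else target
    state[tuple(idx)] = np.flip(state[tuple(idx)], axis=axis)  # X on target
    return state.reshape(-1)

def qnn_probs(thetas, n, layers):
    """Measurement probabilities of a layered RY + CNOT-chain circuit."""
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0
    thetas = thetas.reshape(layers, n)
    for l in range(layers):
        for q in range(n):
            state = apply_single_qubit(state, ry(thetas[l, q]), q, n)
        for q in range(n - 1):
            state = apply_cnot(state, q, q + 1, n)
    return np.abs(state) ** 2             # 2^n >= M probabilities

def mapping_model(bits, prob, W1, b1, W2, b2):
    """Tiny MLP: (basis bitstring, probability) -> one classical weight."""
    x = np.concatenate([bits, [prob]])
    h = np.tanh(W1 @ x + b1)
    return (W2 @ h + b2)[0]

def generate_classical_weights(thetas, mapping_params, M, n, layers):
    """Produce all M weights of the target model from the QT parameters."""
    probs = qnn_probs(thetas, n, layers)
    W1, b1, W2, b2 = mapping_params
    weights = []
    for i in range(M):
        bits = np.array([(i >> b) & 1 for b in range(n)], dtype=float)
        weights.append(mapping_model(bits, probs[i], W1, b1, W2, b2))
    return np.array(weights)

if __name__ == "__main__":
    M = 8                                          # weights in the target classical model
    n = int(np.ceil(np.log2(M)))                   # qubits needed
    layers = 2
    rng = np.random.default_rng(0)
    thetas = rng.normal(size=layers * n)           # trainable circuit angles
    hidden = 4
    mapping_params = (rng.normal(size=(hidden, n + 1)), np.zeros(hidden),
                      rng.normal(size=(1, hidden)), np.zeros(1))
    print(generate_classical_weights(thetas, mapping_params, M, n, layers))
```

In this picture only the circuit angles and the small mapping-model weights are trained, so the number of trainable parameters scales with the qubit count $n = \lceil \log_2 M \rceil$ and the mapping-model size rather than with $M$ itself, which is the compression claimed in the abstract.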
