
On the Sample Complexity of Quantum Boltzmann Machine Learning (2306.14969v4)

Published 26 Jun 2023 in quant-ph and cs.LG

Abstract: Quantum Boltzmann machines (QBMs) are machine-learning models for both classical and quantum data. We give an operational definition of QBM learning in terms of the difference in expectation values between the model and target, taking into account the polynomial size of the data set. By using the relative entropy as a loss function, this problem can be solved without encountering barren plateaus. We prove that a solution can be obtained with stochastic gradient descent using at most a polynomial number of Gibbs states. We also prove that pre-training on a subset of the QBM parameters can only lower the sample complexity bounds. In particular, we give pre-training strategies based on mean-field, Gaussian Fermionic, and geometrically local Hamiltonians. We verify these models and our theoretical findings numerically on a quantum and a classical data set. Our results establish that QBMs are promising machine-learning models.
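
The training loop the abstract describes can be illustrated with a small numerical sketch. For a model Gibbs state rho(theta) = exp(H(theta)) / Z with H(theta) = sum_i theta_i O_i, the gradient of the relative entropy loss is the difference of expectation values, dL/dtheta_i = <O_i>_model - <O_i>_target, so gradient descent drives the model's expectation values toward the target's. The sketch below is an illustrative toy (not the paper's implementation): the 2-qubit Hamiltonian ansatz, learning rate, and iteration count are all assumptions, and exact diagonalization stands in for preparing Gibbs states on hardware.

```python
# Toy sketch of QBM training via the relative-entropy gradient
#   dL/dtheta_i = <O_i>_model - <O_i>_target.
# The 2-qubit ansatz and hyperparameters here are illustrative assumptions.
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

# Hamiltonian terms O_i (an assumed toy ansatz, not taken from the paper)
terms = [np.kron(Z, I2), np.kron(I2, Z), np.kron(Z, Z),
         np.kron(X, I2), np.kron(I2, X)]

def gibbs(theta):
    """Gibbs state exp(H(theta)) / Tr exp(H(theta)) via eigendecomposition."""
    H = sum(t * O for t, O in zip(theta, terms))
    w, V = np.linalg.eigh(H)
    rho = (V * np.exp(w)) @ V.conj().T
    return rho / np.trace(rho).real

def expectations(rho):
    """Expectation values <O_i> = Tr[rho O_i] for every Hamiltonian term."""
    return np.array([np.trace(rho @ O).real for O in terms])

# Synthetic target: the Gibbs state of hidden parameters
rng = np.random.default_rng(0)
target_exp = expectations(gibbs(rng.normal(scale=0.5, size=len(terms))))

# Gradient descent on the relative entropy (convex in theta)
theta = np.zeros(len(terms))
for _ in range(500):
    model_exp = expectations(gibbs(theta))
    theta -= 0.2 * (model_exp - target_exp)

# Maximum residual mismatch in expectation values after training
print(np.max(np.abs(expectations(gibbs(theta)) - target_exp)))
```

In practice the expectation values would come from measurements on prepared Gibbs states rather than exact density matrices; the paper's sample-complexity results bound how many such Gibbs states stochastic gradient descent needs.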

Citations (13)
