Predicting Ground State Properties: Constant Sample Complexity and Deep Learning Algorithms (2405.18489v2)

Published 28 May 2024 in quant-ph, cs.LG, and physics.comp-ph

Abstract: A fundamental problem in quantum many-body physics is that of finding ground states of local Hamiltonians. A number of recent works gave provably efficient ML algorithms for learning ground states. Specifically, [Huang et al., Science 2022] introduced an approach for learning properties of the ground state of an $n$-qubit gapped local Hamiltonian $H$ from only $n^{\mathcal{O}(1)}$ data points sampled from Hamiltonians in the same phase of matter. This was subsequently improved by [Lewis et al., Nature Communications 2024] to $\mathcal{O}(\log n)$ samples when the geometry of the $n$-qubit system is known. In this work, we introduce two approaches that achieve a constant sample complexity, independent of the system size $n$, for learning ground state properties. Our first algorithm consists of a simple modification of the ML model used by Lewis et al. and applies to a property of interest known beforehand. Our second algorithm, which applies even if a description of the property is not known, is a deep neural network model. While the empirical performance of neural networks on such tasks has been demonstrated before, to our knowledge this is the first rigorous sample complexity bound on a neural network model for predicting ground state properties. We also perform numerical experiments confirming the improved scaling of our approach compared to earlier results.
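
To make the flavor of the first algorithm concrete, the sketch below is a minimal, hypothetical illustration of the style of pipeline underlying [Huang et al. 2022] and [Lewis et al. 2024]: sparse ($\ell_1$-regularized) regression from a feature map of the Hamiltonian's local coupling parameters to estimates of a fixed ground state property. The random Fourier feature map, the synthetic target standing in for classical-shadow estimates of $\mathrm{Tr}(O\rho(x))$, and all hyperparameters are assumptions chosen for illustration, not the authors' actual construction; the paper's point is that the number of training pairs can be taken independent of $n$, which this toy data flow does not itself demonstrate.

```python
# Minimal sketch (NOT the paper's implementation): sparse regression from
# Hamiltonian coupling parameters x to a fixed ground state property,
# in the spirit of the shadow-based pipelines cited in the abstract.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_qubits = 8    # one coupling parameter per site, for illustration
n_train = 50    # a system-size-independent training set is the regime of interest

def feature_map(x, n_feat=200, seed=1):
    """Random Fourier features as a hypothetical stand-in for the
    geometrically local feature maps used in the provable algorithms."""
    r = np.random.default_rng(seed)          # fixed seed: same map for every x
    W = r.normal(size=(n_feat, x.shape[0]))
    b = r.uniform(0.0, 2.0 * np.pi, size=n_feat)
    return np.sqrt(2.0 / n_feat) * np.cos(W @ x + b)

# Training data: x -> y, where y stands in for a classical-shadow estimate
# of Tr(O rho(x)); here it is a fabricated smooth function plus shot noise.
X = rng.uniform(-1.0, 1.0, size=(n_train, n_qubits))
y = np.cos(X[:, 0] * X[:, 1]) + 0.5 * np.sin(X[:, 2])
y += 0.05 * rng.normal(size=n_train)

Phi = np.stack([feature_map(x) for x in X])
model = Lasso(alpha=1e-3, max_iter=50_000).fit(Phi, y)  # l1-regularized fit

x_new = rng.uniform(-1.0, 1.0, size=n_qubits)
y_hat = model.predict(feature_map(x_new)[None, :])[0]
print(f"predicted property at new coupling vector: {y_hat:.3f}")
```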

References (97)
  1. Provably efficient machine learning for quantum many-body problems. Science, 377(6613):eabk3333, 2022.
  2. Improved machine learning algorithm for predicting ground state properties. Nature Communications, 15(1):895, 2024.
  3. P. Hohenberg and W. Kohn. Inhomogeneous electron gas. Phys. Rev., 136:B864–B871, 1964.
  4. W. Kohn. Nobel lecture: Electronic structure of matter—wave functions and density functionals. Rev. Mod. Phys., 71:1253–1266, 1999.
  5. Anders W. Sandvik. Stochastic series expansion method with operator-loop update. Phys. Rev. B, 59:R14157–R14160, 1999.
  6. Quantum Monte Carlo. Science, 231(4738):555–560, 1986.
  7. Quantum Monte Carlo Approaches for Correlated Systems. Cambridge University Press, 2017.
  8. Quantum Monte Carlo Methods. Cambridge University Press, 2016.
  9. Steven R White. Density matrix formulation for quantum renormalization groups. Physical Review Letters, 69(19):2863, 1992.
  10. Steven R White. Density-matrix algorithms for quantum renormalization groups. Phys. Rev. B, 48(14):10345, 1993.
  11. Guifré Vidal. Class of quantum many-body states that can be efficiently simulated. Physical Review Letters, 101(11):110501, 2008.
  12. A variational eigenvalue solver on a photonic quantum processor. Nat. Commun., 5:4213, 2014.
  13. Matrix product states and projected entangled pair states: Concepts, symmetries, theorems. Reviews of Modern Physics, 93(4):045003, 2021.
  14. Toby S Cubitt. Dissipative ground state preparation and the dissipative quantum eigensolver. arXiv preprint arXiv:2303.11962, 2023.
  15. Machine learning and the physical sciences. Rev. Mod. Phys., 91:045002, 2019.
  16. Juan Carrasquilla. Machine learning for quantum matter. Adv. Phys.: X, 5(1):1797528, 2020.
  17. Machine learning topological states. Phys. Rev. B, 96:195145, 2017.
  18. Machine learning phases of matter. Nat. Phys., 13:431, 2017.
  19. Solving the quantum many-body problem with artificial neural networks. Science, 355(6325):602–606, 2017.
  20. Learning thermodynamics with Boltzmann machines. Physical Review B, 94(16):165134, 2016.
  21. Restricted Boltzmann machine learning for solving strongly correlated quantum systems. Phys. Rev. B, 96:205152, 2017.
  22. Learning phase transitions by confusion. Nat. Phys., 13:435, 2017.
  23. Lei Wang. Discovering phase transitions with unsupervised learning. Phys. Rev. B, 94:195105, 2016.
  24. Neural message passing for quantum chemistry. arXiv preprint arXiv:1704.01212, 2017.
  25. Neural-network quantum state tomography. Nat. Phys., 14(5):447–450, 2018.
  26. Extrapolating quantum observables with machine learning: inferring multiple phase transitions from properties of a single phase. Physical Review Letters, 121(25):255702, 2018.
  27. Unifying machine learning and quantum chemistry with a deep neural network for molecular wavefunctions. Nat. Commun., 10(1):1–10, 2019.
  28. Neural-network quantum states, string-bond states, and chiral topological states. Phys. Rev. X, 8:011006, 2018.
  29. Out-of-distribution generalization for learning quantum dynamics. arXiv preprint arXiv:2204.10268, 2022.
  30. Identifying topological order through unsupervised machine learning. Nat. Phys., 15(8):790–795, 2019.
  31. OrbNet: Deep learning for quantum chemistry using symmetry-adapted atomic-orbital features. J. Chem. Phys., 153(12):124111, 2020.
  32. Fermionic neural-network states for ab-initio electronic structure. Nat. Commun., 11(1):2368, May 2020.
  33. Predicting excited states from ground state wavefunction by supervised quantum machine learning. Machine Learning: Science and Technology, 1(4):045027, 2020.
  34. Deep learning the Hohenberg-Kohn maps of density functional theory. Physical Review Letters, 125(7):076402, 2020.
  35. Unsupervised mapping of phase diagrams of 2D systems from infinite projected entangled-pair states via deep anomaly detection. SciPost Physics, 11(2):025, 2021.
  36. Predicting properties of quantum systems with conditional generative models. arXiv preprint arXiv:2211.16943, 2022.
  37. Using shadows to learn ground state properties of quantum Hamiltonians. Machine Learning and Physical Sciences Workshop at the 36th Conference on Neural Information Processing Systems (NeurIPS), 2022.
  38. Deep learning and the Schrödinger equation. Phys. Rev. A, 96:042113, 2017.
  39. Scalable neural networks for the efficient learning of disordered quantum systems. Physical Review E, 102(3):033301, 2020.
  40. Machine learning diffusion Monte Carlo forces. The Journal of Physical Chemistry A, 127(1):339–355, 2022.
  41. Fast and accurate modeling of molecular atomization energies with machine learning. Physical Review Letters, 108(5):058301, 2012.
  42. Prediction errors of molecular machine learning models lower than hybrid DFT error. Journal of Chemical Theory and Computation, 13(11):5255–5264, 2017.
  43. Identifying quantum phase transitions using artificial neural networks on experimental data. Nature Physics, 15(9):917–920, 2019.
  44. Machine learning of quantum phase transitions. Physical Review B, 99(12):121104, 2019.
  45. Quantum machine learning. Nature, 549(7671):195–202, 2017.
  46. On the sample complexity of quantum Boltzmann machine learning. arXiv preprint arXiv:2306.14969, 2023.
  47. The complexity of the local Hamiltonian problem. SIAM Journal on Computing, 35(5):1070–1097, 2006.
  48. Efficient learning of lattice quantum systems and phases of matter. arXiv preprint arXiv:2301.12946, 2023.
  49. Provably efficient learning of phases of matter via dissipative evolutions. arXiv preprint arXiv:2311.07506, 2023.
  50. Classification of phases for mixed states via fast dissipative evolution. Quantum, 3:174, 2019.
  51. Exponentially improved efficient machine learning for quantum many-body states with provable guarantees. arXiv preprint arXiv:2304.04353, 2023.
  52. Predicting many properties of a quantum system from very few measurements. Nat. Phys., 16:1050–1057, 2020.
  53. Linear inversion of band-limited reflection seismograms. SIAM Journal on Scientific and Statistical Computing, 7(4):1307–1330, 1986.
  54. Robert Tibshirani. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society: Series B (Methodological), 58(1):267–288, 1996.
  55. Foundations of machine learning. The MIT Press, 2018.
  56. Ridge regression learning algorithm in dual variables. In Proceedings of the Fifteenth International Conference on Machine Learning, pages 515–521, 1998.
  57. On the approximation of functions by tanh neural networks. Neural Networks, 143:732–750, November 2021.
  58. Enhancing accuracy of deep learning algorithms by training with low-discrepancy sequences. SIAM Journal on Numerical Analysis, 59(3):1811–1834, 2021.
  59. Stanislaw K Zaremba. The mathematical basis of Monte Carlo and quasi-Monte Carlo methods. SIAM Review, 10(3):303–314, 1968.
  60. Russel E Caflisch. Monte Carlo and quasi-Monte Carlo methods. Acta Numerica, 7:1–49, 1998.
  61. Harald Niederreiter. Random number generation and quasi-Monte Carlo methods. SIAM, 1992.
  62. Recent advances in randomized quasi-Monte Carlo methods. Modeling uncertainty: An examination of stochastic theory, methods, and applications, pages 419–474, 2002.
  63. Automorphic equivalence within gapped phases of quantum lattice systems. Commun. Math. Phys., 309(3):835–871, 2012.
  64. Quasiadiabatic continuation of quantum states: The stability of topological ground-state degeneracy and emergent gauge invariance. Phys. Rev. B, 72(4):045141, 2005.
  65. Tobias J Osborne. Simulating adiabatic evolution of gapped spin systems. Phys. Rev. A, 75(3):032321, 2007.
  66. Learning to predict arbitrary quantum processes. PRX Quantum, 4(4):040337, 2023.
  67. Mixed-state entanglement from local randomized measurements. Phys. Rev. Lett., 125:200501, 2020.
  68. The randomized measurement toolbox. arXiv preprint arXiv:2203.11374, 2022.
  69. Matchgate shadows for fermionic quantum simulation. arXiv preprint arXiv:2207.13723, 2022.
  70. Classical shadows with Pauli-invariant unitary ensembles. arXiv preprint arXiv:2202.03272, 2022.
  71. Understanding the difficulty of training deep feedforward neural networks. In Proceedings of the thirteenth international conference on artificial intelligence and statistics, pages 249–256. JMLR Workshop and Conference Proceedings, 2010.
  72. Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980, 2014.
  73. Adam: A method for stochastic optimization, 2017.
  74. Gradient descent finds global minima of deep neural networks. In International conference on machine learning, pages 1675–1685. PMLR, 2019.
  75. Ilya M Sobol. Uniformly distributed sequences with an additional uniform property. USSR Computational Mathematics and Mathematical Physics, 16(5):236–242, 1976.
  76. Il’ya Meerovich Sobol’. On the distribution of points in a cube and the approximate evaluation of integrals. Zhurnal Vychislitel’noi Matematiki i Matematicheskoi Fiziki, 7(4):784–802, 1967.
  77. Harald Niederreiter. Point sets and sequences with small discrepancy. Monatshefte für Mathematik, 104:273–337, 1987.
  78. Harald Niederreiter. Low-discrepancy and low-dispersion sequences. Journal of Number Theory, 30(1):51–70, 1988.
  79. John H Halton. On the efficiency of certain quasi-random sequences of points in evaluating multi-dimensional integrals. Numerische Mathematik, 2:84–90, 1960.
  80. Art B Owen. Monte Carlo variance of scrambled net quadrature. SIAM Journal on Numerical Analysis, 34(5):1884–1910, 1997.
  81. Probabilistic discrepancy bound for Monte Carlo point sets, 2012.
  82. Decoupled weight decay regularization, 2019.
  83. Support-vector networks. Mach. Learn., 20(3):273–297, 1995.
  84. Gradient descent provably optimizes over-parameterized neural networks, 2019.
  85. Deep learning observables in computational fluid dynamics. Journal of Computational Physics, 410:109339, 2020.
  86. Understanding machine learning: From theory to algorithms. Cambridge University Press, 2014.
  87. Art B Owen. Multidimensional variation for quasi-Monte Carlo. In Contemporary Multivariate Analysis and Design of Experiments: In Celebration of Professor Kai-Tai Fang's 65th Birthday, pages 49–74. World Scientific, 2005.
  88. Functions of bounded variation, signed measures, and a general Koksma-Hlawka inequality, 2014.
  89. E. Hlawka and R. Mück. Über eine Transformation von gleichverteilten Folgen II. Computing, 9(2):127–138, 1972.
  90. A systematic review on overfitting control in shallow and deep neural networks. Artificial Intelligence Review, pages 1–48, 2021.
  91. Statistical Inference. Duxbury Press, 2002.
  92. Murray Rosenblatt. Remarks on a multivariate transformation. The Annals of Mathematical Statistics, 23(3):470–472, 1952.
  93. Construction of Uniform Designs on Arbitrary Domains by Inverse Rosenblatt Transformation, pages 111–126, May 2020.
  94. Quasi-Monte Carlo Methods for Numerical Integration, chapter 2, pages 13–22.
  95. Howard E Haber. Notes on the matrix exponential and logarithm. Santa Cruz Institute for Particle Physics, University of California: Santa Cruz, CA, USA, 2018.
  96. Steven R. White. Density matrix formulation for quantum renormalization groups. Phys. Rev. Lett., 69:2863–2866, 1992.
  97. Matrix concentration for products. arXiv preprint arXiv:2003.05437, 2020.
