Hamiltonian Learning using Machine Learning Models Trained with Continuous Measurements
Abstract: We build upon recent work on using machine learning models to estimate Hamiltonian parameters from continuous weak measurements of qubits. We consider two settings for training the model: (1) supervised learning, where the weak-measurement training records can be labeled with known Hamiltonian parameters, and (2) unsupervised learning, where no labels are available. The first has the advantage of not requiring an explicit representation of the quantum state and thus potentially scales favorably to larger numbers of qubits. The second requires a physical model that maps the Hamiltonian parameters to a measurement record; we implement it as an integrator of the physical model combined with a recurrent neural network that supplies a model-free correction at every time step, accounting for small effects the physical model does not capture. We test our construction on a system of two qubits and demonstrate accurate prediction of multiple physical parameters in both the supervised and unsupervised settings. We show that the model benefits from larger training sets, establishing that it is in fact "learning," and we demonstrate robustness to errors in the assumed physical model by achieving accurate parameter estimation in the presence of unanticipated single-particle relaxation.
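To make the setting concrete, the following is a minimal, hypothetical sketch of how a continuous weak-measurement record of the kind used as model input here can be generated: an Euler integration of a stochastic master equation for a single driven qubit measured along σ_z. All symbols (`omega` for the Rabi frequency, `kappa` for the measurement rate, `eta` for the detector efficiency) and the normalization conventions are illustrative assumptions, not the paper's actual simulation code.

```python
import numpy as np

def weak_measurement_record(omega=1.0, kappa=0.5, eta=1.0,
                            dt=1e-3, n_steps=2000, seed=0):
    """Hypothetical sketch: Euler integration of a diffusive stochastic
    master equation for one qubit, returning the measurement record dy
    and the final conditioned density matrix. Conventions (factors of 2,
    rate definitions) vary across the literature."""
    rng = np.random.default_rng(seed)
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    H = 0.5 * omega * sx                              # Rabi drive
    rho = np.array([[1, 0], [0, 0]], dtype=complex)   # start in |0>
    record = np.empty(n_steps)
    for t in range(n_steps):
        ez = np.real(np.trace(rho @ sz))              # <sigma_z>
        dW = rng.normal(0.0, np.sqrt(dt))             # Wiener increment
        # measurement record: signal proportional to <sigma_z> plus noise
        record[t] = np.sqrt(eta * kappa) * ez * dt + dW
        # deterministic part: Hamiltonian evolution + measurement dephasing
        det = -1j * (H @ rho - rho @ H) + kappa * (sz @ rho @ sz - rho)
        # stochastic back-action (innovation) term, multiplied by dW
        stoch = np.sqrt(eta * kappa) * (sz @ rho + rho @ sz - 2 * ez * rho)
        rho = rho + det * dt + stoch * dW
        # crude renormalization to keep trace 1 after the Euler step
        rho = rho / np.real(np.trace(rho))
    return record, rho

rec, rho_final = weak_measurement_record()
print(len(rec), np.real(np.trace(rho_final)))
```

In the supervised setting described above, many such records, each labeled with the parameters (here, `omega`) that generated it, would form the training set for a sequence model; in the unsupervised setting, an integrator of this kind sits inside the model itself, with an RNN correcting each step.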