
Quantum Kernel Machine Learning With Continuous Variables (2401.05647v6)

Published 11 Jan 2024 in quant-ph

Abstract: The popular qubit framework has dominated recent work on quantum kernel machine learning, with results characterising expressivity, learnability and generalisation. As yet, there is no comparative framework to understand these concepts for continuous variable (CV) quantum computing platforms. In this paper we represent CV quantum kernels as closed form functions and use this representation to provide several important theoretical insights. We derive a general closed form solution for all CV quantum kernels and show every such kernel can be expressed as the product of a Gaussian and an algebraic function of the parameters of the feature map. Furthermore, in the multi-mode case, we present quantification of a quantum-classical separation for all quantum kernels via a hierarchical notion of the "stellar rank" of the quantum kernel feature map. We then prove kernels defined by feature maps of infinite stellar rank, such as GKP-state encodings, can be approximated arbitrarily well by kernels defined by feature maps of finite stellar rank. Finally, we simulate learning with a single-mode displaced Fock state encoding and show that (i) accuracy on our specific task (an annular data set) increases with stellar rank, (ii) for underfit models, accuracy can be improved by increasing a bandwidth hyperparameter, and (iii) for noisy data that is overfit, decreasing the bandwidth will improve generalisation but does so at the cost of effective stellar rank.
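The simplest instance of the abstract's closed-form result is the stellar-rank-0 case: for a coherent-state encoding, the fidelity kernel is a pure Gaussian (the algebraic factor is trivial), since |⟨β|α⟩|² = exp(−|α − β|²). The sketch below is an illustrative classical computation of that kernel, not code from the paper; the encoding x ↦ α = c·x and the constant `c` (playing the role of the bandwidth hyperparameter from point (ii)/(iii)) are assumptions for illustration.

```python
import math

def coherent_kernel(x, y, c=1.0):
    # Fidelity kernel |<beta|alpha>|^2 for coherent states with
    # displacement encoding alpha = c*x, beta = c*y (stellar rank 0).
    # c is the (assumed) bandwidth hyperparameter: small c flattens the
    # kernel, large c sharpens it.
    return math.exp(-abs(c * (x - y)) ** 2)

# Kernel values against x = 0 for a few 1-D inputs, at two bandwidths.
xs = [0.0, 0.5, 1.0, 2.0]
for c in (0.3, 2.0):
    row = [round(coherent_kernel(0.0, x, c), 3) for x in xs]
    print(f"c = {c}: {row}")
```

With the small bandwidth every pair of points looks similar (kernel values near 1, a smoother model that combats overfitting); with the large bandwidth the kernel decays rapidly, matching the abstract's observed trade-off between bandwidth and generalisation.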
