
On the expressivity of embedding quantum kernels (2309.14419v2)

Published 25 Sep 2023 in quant-ph, cs.LG, and stat.ML

Abstract: One of the most natural connections between quantum and classical machine learning has been established in the context of kernel methods. Kernel methods rely on kernels, which are inner products of feature vectors living in large feature spaces. Quantum kernels are typically evaluated by explicitly constructing quantum feature states and then taking their inner product, here called embedding quantum kernels. Since classical kernels are usually evaluated without using the feature vectors explicitly, we wonder how expressive embedding quantum kernels are. In this work, we raise the fundamental question: can all quantum kernels be expressed as the inner product of quantum feature states? Our first result is positive: Invoking computational universality, we find that for any kernel function there always exists a corresponding quantum feature map and an embedding quantum kernel. The more operational reading of the question is concerned with efficient constructions, however. In a second part, we formalize the question of universality of efficient embedding quantum kernels. For shift-invariant kernels, we use the technique of random Fourier features to show that they are universal within the broad class of all kernels which allow a variant of efficient Fourier sampling. We then extend this result to a new class of so-called composition kernels, which we show also contains projected quantum kernels introduced in recent works. After proving the universality of embedding quantum kernels for both shift-invariant and composition kernels, we identify the directions towards new, more exotic, and unexplored quantum kernel families, for which it still remains open whether they correspond to efficient embedding quantum kernels.

An In-Depth Analysis of the Expressivity of Embedding Quantum Kernels

The paper "On the expressivity of embedding quantum kernels" embarks on a foundational exploration into the field of quantum machine learning, particularly focusing on the capacity and limitations of embedding quantum kernels (EQKs). It addresses the central question of whether EQKs encompass the entire spectrum of quantum kernel functions, considering efficiency constraints.

Overview

Embedding quantum kernels are a class of kernel functions derived from quantum feature maps: classical data is mapped into the quantum state space, and the kernel is defined via the Hilbert-Schmidt inner product of the resulting feature states. This formulation is motivated by the capacity of quantum states to provide a high-dimensional feature space amenable to kernel methods. The universality of EQKs is asserted by Theorem 1 of the paper, which shows that any kernel function can be approximated as an EQK, albeit with no constraint on computational resources.
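The definition above can be illustrated numerically. The following is a minimal NumPy sketch of an EQK, assuming a toy single-qubit angle-encoding feature map (a hypothetical choice for illustration, not the paper's construction): the kernel value is the squared overlap of two pure feature states.

```python
import numpy as np

def feature_state(x, n_qubits=2):
    # Toy quantum feature map (illustrative assumption): angle-encode each
    # input component as a single-qubit state, then tensor the qubits together.
    state = np.array([1.0])
    for j in range(n_qubits):
        theta = x[j % len(x)]
        qubit = np.array([np.cos(theta / 2), np.sin(theta / 2)])
        state = np.kron(state, qubit)
    return state

def embedding_quantum_kernel(x, y):
    # EQK value: squared overlap |<phi(x)|phi(y)>|^2 of the feature states,
    # i.e. the Hilbert-Schmidt inner product of the corresponding pure states.
    return np.abs(feature_state(x) @ feature_state(y)) ** 2

x = np.array([0.3, 1.1])
k_xx = embedding_quantum_kernel(x, x)  # a state's overlap with itself is 1
```

Since the feature states are normalized, the kernel is symmetric and satisfies k(x, x) = 1, which is the hallmark of fidelity-type embedding kernels.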

Shift-Invariant Kernels and Random Fourier Features

The exploration then asks whether efficient EQK approximations can be achieved universally within specific kernel classes. The paper narrows its scope to shift-invariant kernels, commonly used in classical machine learning, and leverages the Random Fourier Features (RFF) methodology. Bochner's theorem, pivotal to the RFF method, states that a continuous shift-invariant function is positive semi-definite (PSD) if and only if it is the Fourier transform of a non-negative measure.

Corollary 1 shows that, under smoothness conditions (specifically bounded second derivatives), shift-invariant kernels can be approximated efficiently by EQKs, with the dimensionality of the feature space scaling polynomially in the relevant parameters. This provides a constructive path to efficient EQK realizations whenever sampling from the inverse Fourier transform of the kernel is computationally feasible.
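The classical RFF construction underlying this result can be sketched in a few lines. Assuming a Gaussian kernel as the shift-invariant example, Bochner's theorem says its spectral measure is itself Gaussian, so sampling frequencies from a normal distribution and building cosine features yields a low-dimensional approximation of the kernel (the sample size D and bandwidth are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(X, D=2000, sigma=1.0):
    # Bochner's theorem: the Gaussian kernel exp(-||x - y||^2 / (2 sigma^2))
    # is the Fourier transform of a Gaussian spectral measure, so we sample
    # frequencies W from it and build random cosine features.
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

X = rng.normal(size=(5, 3))
Z = rff_features(X)
K_approx = Z @ Z.T  # inner products of explicit finite-dimensional features
K_exact = np.exp(-np.sum((X[:, None] - X[None]) ** 2, axis=-1) / 2.0)
err = np.max(np.abs(K_approx - K_exact))  # shrinks as D grows, roughly 1/sqrt(D)
```

The paper's quantum construction follows the same logic: once the kernel admits efficient Fourier sampling, the explicit finite-dimensional feature vector can be encoded into a quantum feature state, yielding an EQK.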

Composition Kernels and Beyond

The discussion then broadens to composition kernels, which generalize shift-invariant kernels by incorporating pre-processing functions into their formulation. Proposition 1 establishes that these, too, can be efficiently approximated as EQKs, assuming polynomial scaling of certain parameters. This construction encompasses the projected quantum kernel, a recent development that connects the classical-shadow formalism with kernel methods in QML.
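The structure of a composition kernel can be made concrete with a small sketch: a shift-invariant base kernel is evaluated on pre-processed inputs f(x), f(y). Here the pre-processing f is a hypothetical stand-in (a tanh map); in the projected-quantum-kernel setting it would instead collect expectation values of observables on a data-encoding circuit.

```python
import numpy as np

def gaussian_si_kernel(u, v, sigma=1.0):
    # Shift-invariant base kernel: depends only on the difference u - v.
    return np.exp(-np.sum((u - v) ** 2) / (2.0 * sigma**2))

def composition_kernel(x, y, f):
    # Composition kernel: the shift-invariant kernel composed with a
    # pre-processing function f applied to both inputs.
    return gaussian_si_kernel(f(x), f(y))

# Hypothetical pre-processing map; in a projected quantum kernel this role
# is played by measured expectation values of local observables.
f = lambda x: np.tanh(x)

x, y = np.array([0.2, -0.5]), np.array([0.2, -0.5])
k = composition_kernel(x, y, f)  # identical inputs give kernel value 1
```

Because the base kernel is shift-invariant, the RFF argument for efficient EQK approximation carries over whenever f itself is efficiently computable, which is the intuition behind Proposition 1.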

Implications and Speculation

The paper's results have noteworthy implications for the future of quantum machine learning. By confirming that two major classes of kernel functions can be efficiently realized as EQKs, the authors implicitly advocate for the expressive robustness of EQKs. These insights suggest that further exploration into novel quantum kernels might still yield useful models that incorporate structural properties advantageous for specific tasks.

Conclusions and Open Problems

While largely establishing the universality of EQKs, the work initiates a quest for kernel functions that defy efficient EQK realization. Promising future research involves investigating non-shift-invariant and indefinite kernels through the lens of HQPs' spectral properties, as well as variational principles within QML, as suggested by connections to parametrized quantum circuits.

Ultimately, this paper consolidates a theoretical backbone for embedding quantum kernels, while maintaining the allure of unexplored landscapes in quantum kernel methods. The complexity of EQKs fittingly mirrors the sophisticated nature of quantum states, suggesting that future advances in QML may very well rest upon leveraging these intricate structures effectively.

Authors (3)
  1. Elies Gil-Fuster (13 papers)
  2. Jens Eisert (197 papers)
  3. Vedran Dunjko (97 papers)
Citations (9)