Efficient Parameter Optimisation for Quantum Kernel Alignment: A Sub-sampling Approach in Variational Training (2401.02879v2)

Published 5 Jan 2024 in quant-ph and cs.LG

Abstract: Quantum machine learning with quantum kernels for classification problems is a growing area of research. Recently, quantum kernel alignment techniques that parameterise the kernel have been developed, allowing the kernel to be trained and therefore aligned with a specific dataset. While quantum kernel alignment is a promising technique, it has been hampered by considerable training costs because the full kernel matrix must be constructed at every training iteration. Addressing this challenge, we introduce a novel method that seeks to balance efficiency and performance. We present a sub-sampling training approach that uses a subset of the kernel matrix at each training step, thereby reducing the overall computational cost of the training. In this work, we apply the sub-sampling method to synthetic datasets and a real-world breast cancer dataset and demonstrate considerable reductions in the number of circuits required to train the quantum kernel while maintaining classification accuracy.
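The core idea of the paper, optimising kernel parameters against a randomly sub-sampled block of the kernel matrix at each step rather than the full matrix, can be sketched in plain NumPy. This is an illustrative stand-in, not the authors' implementation: a classical parameterised RBF kernel replaces the quantum kernel circuit, the kernel-target alignment objective is assumed as the training target, and the gradient is taken by central differences. All function and variable names here (`train_subsampled`, `m`, `gamma`) are hypothetical.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma):
    # Parameterised kernel; a classical RBF stands in for the quantum kernel
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def alignment(K, y):
    # Kernel-target alignment: <K, yy^T>_F / (||K||_F * ||yy^T||_F)
    Y = np.outer(y, y)
    return (K * Y).sum() / (np.linalg.norm(K) * np.linalg.norm(Y))

def train_subsampled(X, y, gamma=1.0, m=8, steps=200, lr=0.05, eps=1e-4, seed=0):
    rng = np.random.default_rng(seed)
    for _ in range(steps):
        # Sub-sample: only an m x m block of the kernel matrix is built per step,
        # instead of the full n x n matrix
        idx = rng.choice(len(X), size=m, replace=False)
        Xs, ys = X[idx], y[idx]
        # Central-difference gradient of the alignment on the sub-kernel
        a_plus = alignment(rbf_kernel(Xs, Xs, gamma + eps), ys)
        a_minus = alignment(rbf_kernel(Xs, Xs, gamma - eps), ys)
        gamma += lr * (a_plus - a_minus) / (2 * eps)  # gradient ascent
    return gamma
```

In a quantum setting each kernel entry costs one or more circuit evaluations, so shrinking the per-step matrix from n x n to m x m is what drives the reduction in circuit count reported in the abstract.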
