A Hyperparameter Study for Quantum Kernel Methods (2310.11891v3)
Abstract: Quantum kernel methods are a promising approach in quantum machine learning, thanks to the theoretical guarantees associated with them. Their accessibility to analytic treatment also opens up the possibility of prescreening datasets based on their potential for a quantum advantage. To this end, earlier works developed the geometric difference, which can be understood as a closeness measure between two kernel-based machine learning approaches, most importantly between a quantum kernel and a classical kernel. This metric links the quantum and classical model complexities and was developed to bound the generalization error. It therefore raises the question of how the metric behaves in an empirical setting. In this work, we investigate the effects of hyperparameter choice on model performance and on the generalization gap between classical and quantum kernels. The importance of hyperparameters is well known in classical machine learning as well. Of special interest are the hyperparameters of the quantum Hamiltonian evolution feature map, as well as the number of qubits to trace out before computing a projected quantum kernel. We conduct a thorough investigation of these hyperparameters across 11 datasets and identify certain aspects that can be exploited. Analyzing the effects of particular hyperparameter settings on empirical performance, as measured by cross-validation accuracy, and on generalization ability, as measured by the geometric difference described above, brings us one step closer to understanding the potential of quantum kernel methods on classical datasets.
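Since the geometric difference is the central quantity in this study, a short numerical sketch may help make it concrete. The snippet below is illustrative and not taken from the paper: it assumes the definition g(K_C‖K_Q) = sqrt(‖√K_Q · K_C⁻¹ · √K_Q‖_∞) from Huang et al., "Power of data in quantum machine learning" (2021), where ‖·‖_∞ is the spectral norm and both Gram matrices are normalized to trace N. The function name, the regularization constant `eps`, and the second RBF matrix standing in for a quantum kernel Gram matrix are all hypothetical placeholders.

```python
# Hedged sketch (not the authors' code): geometric difference between a
# classical and a quantum kernel Gram matrix, under the assumed definition
# g(K_C || K_Q) = sqrt(|| sqrt(K_Q) K_C^{-1} sqrt(K_Q) ||_inf).
import numpy as np
from scipy.linalg import sqrtm
from sklearn.metrics.pairwise import rbf_kernel


def geometric_difference(k_classical: np.ndarray, k_quantum: np.ndarray,
                         eps: float = 1e-10) -> float:
    """Geometric difference between two (N x N) kernel Gram matrices."""
    n = k_classical.shape[0]
    # Normalize both Gram matrices to trace N, as is customary for this metric.
    k_c = n * k_classical / np.trace(k_classical)
    k_q = n * k_quantum / np.trace(k_quantum)
    sqrt_kq = np.real(sqrtm(k_q))
    # Regularized inverse of the classical Gram matrix for numerical stability
    # (eps is an illustrative choice, not a value from the paper).
    k_c_inv = np.linalg.inv(k_c + eps * np.eye(n))
    inner = sqrt_kq @ k_c_inv @ sqrt_kq
    # ord=2 gives the spectral norm (largest singular value).
    return float(np.sqrt(np.linalg.norm(inner, ord=2)))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(50, 4))       # toy dataset: 50 samples, 4 features
    k_c = rbf_kernel(x, gamma=0.5)     # classical RBF Gram matrix
    # Placeholder for a quantum Gram matrix; in practice this would be the
    # (projected) quantum kernel evaluated on the same data.
    k_q = rbf_kernel(x, gamma=5.0)
    print("geometric difference g_CQ =", geometric_difference(k_c, k_q))
```

A large geometric difference indicates that the quantum kernel assigns geometry to the data that the classical kernel cannot mimic, which is the prescreening signal discussed in the abstract; a value close to 1 suggests the classical kernel can match the quantum model.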