Quantum-Classical Multiple Kernel Learning (2305.17707v1)

Published 28 May 2023 in quant-ph

Abstract: As quantum computers become increasingly practical, so does the prospect of using quantum computation to improve upon traditional algorithms. Kernel methods in machine learning are one area where such improvements could be realized in the near future. Paired with kernel methods like support-vector machines, small and noisy quantum computers can evaluate classically-hard quantum kernels that capture unique notions of similarity in data. Taking inspiration from techniques in classical machine learning, this work investigates simulated quantum kernels in the context of multiple kernel learning (MKL). We consider pairwise combinations of several classical-classical, quantum-quantum, and quantum-classical kernels in an empirical investigation of their classification performance with support-vector machines. We also introduce a novel approach, which we call QCC-net (quantum-classical-convex neural network), for optimizing the weights of base kernels together with any kernel parameters. We show this approach to be effective for enhancing various performance metrics in an MKL setting. Looking at data with an increasing number of features (up to 13 dimensions), we find parameter training to be important for successfully weighting kernels in some combinations. Using the optimal kernel weights as indicators of relative utility, we find growing contributions from trainable quantum kernels in quantum-classical kernel combinations as the number of features increases. We observe the opposite trend for combinations containing simpler, non-parametric quantum kernels.
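To make the pairwise quantum-classical MKL setup concrete, the sketch below builds a convex combination K = w·K_quantum + (1 − w)·K_rbf of two base kernels and feeds it to a support-vector machine as a precomputed kernel. This is not the paper's implementation: the "quantum" kernel is a classically simulated product-state fidelity kernel, the weight w is selected by a simple cross-validated grid search rather than the paper's QCC-net optimization, and the dataset is a synthetic placeholder.

```python
# Minimal MKL sketch (assumptions: simulated fidelity kernel, grid search over w,
# synthetic data). Illustrates weighting a quantum-style kernel against a
# classical RBF kernel for an SVM, not the paper's QCC-net method.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

def fidelity_kernel(A, B):
    """Simulated quantum kernel: encode each feature as a single-qubit RY
    rotation and take the squared overlap of the resulting product states."""
    diff = A[:, None, :] - B[None, :, :]            # shape (n_A, n_B, d)
    # Per-feature overlap cos((a - b)/2)^2, multiplied across features/qubits.
    return np.prod(np.cos(diff / 2.0) ** 2, axis=-1)

X, y = make_classification(n_samples=200, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

Kq_tr, Kc_tr = fidelity_kernel(X_tr, X_tr), rbf_kernel(X_tr, X_tr)
Kq_te, Kc_te = fidelity_kernel(X_te, X_tr), rbf_kernel(X_te, X_tr)

# Convex combination: any w in [0, 1] keeps the combined kernel positive
# semidefinite, so it remains a valid SVM kernel.
best_w, best_score = 0.0, -np.inf
for w in np.linspace(0.0, 1.0, 11):
    K_tr = w * Kq_tr + (1.0 - w) * Kc_tr
    score = cross_val_score(SVC(kernel="precomputed"), K_tr, y_tr, cv=5).mean()
    if score > best_score:
        best_w, best_score = w, score

clf = SVC(kernel="precomputed").fit(best_w * Kq_tr + (1 - best_w) * Kc_tr, y_tr)
acc = clf.score(best_w * Kq_te + (1 - best_w) * Kc_te, y_te)
print(f"best weight w={best_w:.1f}, test accuracy={acc:.3f}")
```

The selected weight plays the role the paper assigns to the optimal kernel weights: a rough indicator of how much each base kernel contributes to the combination.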
