
Quadratic Quantum Speedup for Perceptron Training (2109.04695v1)

Published 10 Sep 2021 in quant-ph

Abstract: Perceptrons, which perform binary classification, are the fundamental building blocks of neural networks. Given a data set of size $N$ and margin $\gamma$ (how well the given data are separated), the query complexity of the best-known quantum training algorithm scales as either $(\nicefrac{\sqrt{N}}{\gamma^2})\log(\nicefrac{1}{\gamma^2})$ or $\nicefrac{N}{\sqrt{\gamma}}$, which is achieved by a hybrid of classical and quantum search. In this paper, we improve the version space quantum training method for perceptrons such that the query complexity of our algorithm scales as $\sqrt{\nicefrac{N}{\gamma}}$. This is achieved by constructing an oracle for the perceptrons using quantum counting of the number of data elements that are correctly classified. We show that the query complexity to construct such an oracle has a quadratic improvement over classical methods. Once such an oracle is constructed, bounded-error quantum search can be used to search over the hyperplane instances. The optimality of our algorithm is proven by reducing the evaluation of a two-level AND-OR tree (for which the query complexity lower bound is known) to a multi-criterion search. Our quantum training algorithm can be generalized to train more complex machine learning models such as neural networks, which are built on a large number of perceptrons.
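The abstract describes a two-stage procedure: an oracle that decides whether a candidate hyperplane lies in the version space (built with quantum counting of correctly classified points), followed by bounded-error quantum search over candidate hyperplanes. As a rough illustration only, here is a purely classical sketch of that version-space loop; all names (`classifies_all`, `version_space_train`) are illustrative choices, not from the paper, and both quantum subroutines are replaced by their classical counterparts.

```python
import numpy as np

def classifies_all(w, X, y, gamma=0.0):
    """Version-space membership test: does hyperplane w classify every
    sample correctly with margin at least gamma?

    Classically this costs N data queries. In the paper this test is
    built from quantum counting of the correctly classified points,
    which is one source of the quadratic speedup.
    Assumes rows of X and w are unit-normalized so margins are comparable.
    """
    return np.all(y * (X @ w) >= gamma)

def version_space_train(X, y, num_candidates=10_000, gamma=0.0, seed=0):
    """Classical stand-in for version-space perceptron training.

    Candidate hyperplanes are sampled at random; on the order of
    1/gamma samples suffice to hit the version space with constant
    probability. The quantum algorithm searches these candidates with
    bounded-error quantum search instead of this linear scan.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    for _ in range(num_candidates):
        w = rng.normal(size=d)         # random direction
        w /= np.linalg.norm(w)         # project onto the unit sphere
        if classifies_all(w, X, y, gamma):  # oracle query
            return w
    return None                        # no separator found in the budget
```

This classical loop makes on the order of $\nicefrac{N}{\gamma}$ data queries (about $\nicefrac{1}{\gamma}$ candidates, each checked against $N$ points), whereas the paper's algorithm replaces both the per-candidate count and the outer scan with quantum routines to reach $\sqrt{\nicefrac{N}{\gamma}}$.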

Citations (2)
