Quantum Circuit Design for Training Perceptron Models

Published 15 Feb 2018 in quant-ph | (1802.05428v3)

Abstract: The perceptron model is a fundamental linear classifier in machine learning and a building block of artificial neural networks. Recently, Wiebe et al. (arXiv:1602.04799) proposed that the training of a perceptron can be quadratically sped up using Grover search on a quantum computer, which has potentially important big-data applications. In this paper, we design a quantum circuit for implementing this algorithm. The Grover oracle, the central part of the circuit, is realized by Quantum-Fourier-Transform-based arithmetic that determines whether an input weight vector correctly classifies all training data samples. We also analyze the number of qubits and universal gates required by the algorithm, as well as the success probability under uniform sampling, showing that it is higher than under sampling from the spherical Gaussian distribution $N(0,1)$. The feasibility of the circuit is demonstrated with a test example on the IBM-Q cloud quantum computer, in which 16 qubits are used to classify four data samples.
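As a rough illustration of the search structure the abstract describes, the sketch below classically simulates Grover-style amplitude amplification over a small set of candidate weight vectors, with the oracle implemented as the classical check that a candidate separates all training samples. The toy data, the discrete candidate set, and the classical simulation itself are assumptions for illustration only; they are not the paper's quantum circuit construction.

```python
# Minimal classical sketch of Grover search over perceptron weight candidates.
# Assumptions (not from the paper): candidates are a small discrete set of
# sampled directions, and the oracle is simulated classically.
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: four 2-D samples with labels in {-1, +1}.
X = np.array([[1.0, 1.0], [2.0, 0.5], [-1.0, -1.0], [-0.5, -2.0]])
y = np.array([1, 1, -1, -1])

def classifies_all(w, X, y):
    """Oracle predicate: True iff sign(w.x) matches the label of every sample."""
    return np.all(np.sign(X @ w) == y)

# Candidate weight vectors (uniformly sampled directions); the quantum circuit
# in the paper searches such a candidate register with Grover iterations.
n_candidates = 16
candidates = rng.normal(size=(n_candidates, 2))
marked = np.array([classifies_all(w, X, y) for w in candidates])

# Start from a uniform superposition over candidates and apply the standard
# oracle + diffusion reflections for ~O(sqrt(N/M)) iterations.
amps = np.full(n_candidates, 1.0 / np.sqrt(n_candidates))
n_marked = max(int(marked.sum()), 1)
n_iters = int(np.floor(np.pi / 4 * np.sqrt(n_candidates / n_marked)))

for _ in range(n_iters):
    amps[marked] *= -1.0                 # oracle: phase flip on marked states
    amps = 2.0 * amps.mean() - amps      # diffusion: inversion about the mean

best = int(np.argmax(amps ** 2))
print(f"most likely candidate after {n_iters} Grover iterations:",
      candidates[best], "separates all samples:", bool(marked[best]))
```

Running the sketch, the amplitude concentrates on candidates that classify every sample correctly after roughly sqrt(N/M) iterations, which is the quadratic speedup over checking candidates one by one.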
