
Quantum Circuit Design for Training Perceptron Models (1802.05428v3)

Published 15 Feb 2018 in quant-ph

Abstract: The perceptron is a fundamental linear classifier in machine learning and a building block of artificial neural networks. Recently, Wiebe et al. (arXiv:1602.04799) showed that perceptron training can be quadratically sped up on a quantum computer using Grover search, with potentially important big-data applications. In this paper, we design a quantum circuit implementing this algorithm. The Grover oracle, the central part of the circuit, is realized with Quantum-Fourier-Transform-based arithmetic that determines whether an input weight vector correctly classifies all training data samples. We also analyze the number of qubits and universal gates the algorithm requires, as well as the success probability under uniform sampling, showing that it exceeds that of sampling from the spherical Gaussian distribution $N(0,1)$. The feasibility of the circuit is demonstrated on a test example using the IBM-Q cloud quantum computer, where 16 qubits are used to classify four data samples.
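To make the search concrete, here is a minimal classical NumPy sketch of the Grover-style training loop the abstract describes: candidate weight vectors are sampled uniformly, an oracle predicate marks those that classify every training sample, and amplitude amplification boosts the marked amplitudes. The toy data, the discretization into $2^n$ candidates, and the statevector simulation are all illustrative assumptions; the paper's actual construction realizes the oracle with QFT-based arithmetic on 16 qubits, which this sketch does not reproduce.

```python
import numpy as np

# --- hypothetical toy setup (NOT the paper's exact data) ---
rng = np.random.default_rng(0)

# Four linearly separable 2-D training samples with labels in {-1, +1}.
X = np.array([[1.0, 2.0], [2.0, 1.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])

def classifies_all(w):
    """Oracle predicate: does weight vector w classify every sample?"""
    return np.all(np.sign(X @ w) == y)

# Discretize the search space: N = 2**n candidate weight vectors drawn
# uniformly from the unit circle (the abstract argues uniform sampling
# gives a higher success probability than a spherical Gaussian).
n = 8
N = 2 ** n
W = rng.standard_normal((N, 2))
W /= np.linalg.norm(W, axis=1, keepdims=True)

marked = np.array([classifies_all(w) for w in W])
M = int(marked.sum())

# Grover search, simulated on a classical statevector over the N
# candidates. Roughly pi/4 * sqrt(N/M) iterations are needed, versus
# O(N/M) oracle calls for classical random search.
amp = np.full(N, 1.0 / np.sqrt(N))
iters = int(np.floor(np.pi / 4 * np.sqrt(N / max(M, 1))))
for _ in range(iters):
    amp[marked] *= -1.0          # oracle: phase-flip the marked states
    amp = 2 * amp.mean() - amp   # diffusion: inversion about the mean

p_success = float(np.sum(amp[marked] ** 2))
print(f"{M}/{N} candidates classify all samples; after {iters} "
      f"Grover iterations the measurement succeeds w.p. {p_success:.3f}")
```

Measuring the final state returns a valid separating weight vector with the printed probability, illustrating where the quadratic speedup in oracle queries comes from.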
