
Quantum Neuron with Separable-State Encoding (2202.08306v1)

Published 16 Feb 2022 in quant-ph

Abstract: The use of advanced quantum neuron models for pattern recognition requires fault tolerance, so such models cannot yet be tested at scale on currently available quantum processors. As an alternative, we propose a quantum perceptron (QP) model that uses a reduced number of multi-qubit gates and is therefore less susceptible to quantum errors on current quantum hardware with limited error tolerance. The proposed quantum algorithm is superior to its classical counterpart, although, since it does not take full advantage of quantum entanglement, it provides lower encoding power than other quantum algorithms that use multi-qubit entanglement. However, separable-state encoding allows the algorithm and different training schemes to be tested at scale on currently available non-fault-tolerant quantum computers. We demonstrate the performance of the proposed model by implementing a few-qubit version of the QP on a simulated quantum computer. The proposed QP uses an N-ary encoding of the binary input data characterizing the patterns. We develop a hybrid (quantum-classical) training procedure to simulate the learning process of the QP and test its efficiency.
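The abstract's key idea, encoding patterns into a separable (product) state so that no entangling gates are needed, can be illustrated with a minimal numerical sketch. This is not the authors' exact circuit; it is an assumption-laden toy in which each N-ary digit of a pattern sets a single-qubit Ry rotation angle, and the perceptron activation is taken to be the squared overlap between the input state and a weight state, which factorizes qubit by qubit precisely because both states are separable:

```python
import numpy as np

def ry_state(theta):
    """Single-qubit state Ry(theta)|0> = [cos(theta/2), sin(theta/2)]."""
    return np.array([np.cos(theta / 2.0), np.sin(theta / 2.0)])

def encode(pattern, levels):
    """Map each N-ary digit d in {0, ..., levels-1} to an angle in [0, pi].
    One qubit per digit, so the full state is a product (separable) state."""
    return [ry_state(np.pi * d / (levels - 1)) for d in pattern]

def activation(input_pattern, weight_pattern, levels):
    """Toy perceptron output: squared overlap |<w|x>|^2. Because both
    states are product states, the overlap is a product of per-qubit
    overlaps -- no multi-qubit gates are required to evaluate it."""
    xs = encode(input_pattern, levels)
    ws = encode(weight_pattern, levels)
    overlap = np.prod([np.dot(w, x) for w, x in zip(ws, xs)])
    return overlap ** 2

# Identical patterns overlap fully (activation close to 1.0);
# maximally different digits drive the activation toward 0.
print(activation([0, 1, 2], [0, 1, 2], levels=3))
print(activation([0, 0, 0], [2, 2, 2], levels=3))
```

A hybrid training loop of the kind the abstract mentions could then adjust `weight_pattern` (or continuous angles) classically to maximize this activation on target patterns; the names `encode` and `activation` here are illustrative, not taken from the paper.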
