
Active online learning in the binary perceptron problem

Published 21 Feb 2019 in cs.LG and cond-mat.dis-nn (arXiv:1902.08043v1)

Abstract: The binary perceptron is the simplest artificial neural network, formed by $N$ input units and one output unit, with the neural states and the synaptic weights all restricted to $\pm 1$ values. The task in the teacher-student scenario is to infer the hidden weight vector by training on a set of labeled patterns. Previous efforts on the passive learning mode have shown that learning from independent random patterns is quite inefficient. Here we consider the active online learning mode, in which the student designs every new Ising training pattern. We demonstrate that it is mathematically possible to achieve perfect (error-free) inference using only $N$ designed training patterns, but this is computationally infeasible for large systems. We then investigate two Bayesian statistical designing protocols, which require $2.3 N$ and $1.9 N$ training patterns, respectively, to achieve error-free inference. If the training patterns are instead designed through deductive reasoning, perfect inference is achieved using $N + \log_{2} N$ samples. The performance gap between the Bayesian and deductive designing strategies may be narrowed in future work by taking into account the possibility of ergodicity breaking in the version space of the binary perceptron.
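The abstract only summarizes the designing protocols, so the following is a rough, self-contained Python sketch of the general idea behind Bayesian pattern design: track the version space of $\pm 1$ weight vectors consistent with the labels seen so far, and design each new pattern to split it as evenly as possible. The greedy random-search heuristic and all names (`design_pattern`, `prune`) are illustrative assumptions, not the paper's actual protocols, and the brute-force enumeration is only workable for small $N$.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

N = 12                                  # small, so the full version space fits in memory
teacher = rng.choice([-1, 1], size=N)   # hidden +/-1 weight vector to be inferred

# Enumerate all 2^N candidate weight vectors: the initial version space.
candidates = np.array(list(itertools.product([-1, 1], repeat=N)))

def label(w, x):
    # Perceptron output: sign of the overlap (ties broken toward +1,
    # consistently for teacher and candidates).
    return 1 if w @ x >= 0 else -1

def prune(version_space, x, y):
    # Keep only the weight vectors consistent with the new labeled pattern.
    outputs = np.where(version_space @ x >= 0, 1, -1)
    return version_space[outputs == y]

def design_pattern(version_space, n_trials=200):
    # Toy active design: among random +/-1 patterns, pick the one whose
    # label would bisect the current version space most evenly
    # (i.e., maximize the information gained from one binary label).
    best_x, best_balance = None, -1.0
    for _ in range(n_trials):
        x = rng.choice([-1, 1], size=N)
        frac = np.mean(version_space @ x >= 0)
        balance = min(frac, 1.0 - frac)   # 0.5 would be a perfect bisection
        if balance > best_balance:
            best_x, best_balance = x, balance
    return best_x

version_space = candidates
n_patterns = 0
while len(version_space) > 1:
    x = design_pattern(version_space)
    y = label(teacher, x)          # query the teacher on the designed pattern
    version_space = prune(version_space, x, y)
    n_patterns += 1

print(f"teacher recovered after {n_patterns} designed patterns:",
      np.array_equal(version_space[0], teacher))
```

The information-theoretic logic matches the abstract's scalings: there are $2^N$ candidate teachers and each binary label conveys at most one bit, so at least $N$ patterns are needed, and a perfectly bisecting designer would need exactly $N$; the reported $2.3 N$ and $1.9 N$ figures for the Bayesian protocols reflect that tractable designing falls short of perfect bisection.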

Citations (5)

