
On the role of synaptic stochasticity in training low-precision neural networks (1710.09825v2)

Published 26 Oct 2017 in cond-mat.dis-nn, cs.LG, cs.NE, and stat.ML

Abstract: Stochasticity and limited precision of synaptic weights in neural network models are key aspects of both biological and hardware modeling of learning processes. Here we show that a neural network model with stochastic binary weights naturally gives prominence to exponentially rare dense regions of solutions with a number of desirable properties such as robustness and good generalization performance, while typical solutions are isolated and hard to find. Binary solutions of the standard perceptron problem are obtained from a simple gradient descent procedure on a set of real values parametrizing a probability distribution over the binary synapses. Both analytical and numerical results are presented. An algorithmic extension aimed at training discrete deep neural networks is also investigated.
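
The procedure named in the abstract — gradient descent on real values parametrizing a probability distribution over binary synapses — can be illustrated with a minimal mean-field sketch. This is not the authors' exact algorithm: the tanh parametrization of the synaptic means, the Gaussian (central-limit) smoothing of the perceptron output, and all names and hyperparameters (xi, h, lr, the step count) are illustrative assumptions.

```python
# Hedged sketch, NOT the paper's exact method: train a binary perceptron by
# gradient descent on real fields h_i that parametrize independent
# distributions over binary synapses w_i in {-1, +1}, with mean m_i = tanh(h_i).
import numpy as np
from scipy.stats import norm
from scipy.special import log_ndtr

rng = np.random.default_rng(0)
N, P = 201, 100                              # synapses, patterns (alpha = P/N = 0.5)
xi = rng.choice([-1.0, 1.0], size=(P, N))    # random binary input patterns
y = rng.choice([-1.0, 1.0], size=P)          # random binary labels
h = 0.01 * rng.standard_normal(N)            # real params: P(w_i = +1) = (1 + tanh h_i) / 2
lr = 0.1

for step in range(2000):
    m = np.tanh(h)                                   # mean synaptic value
    mu = xi @ m                                      # mean pre-activation per pattern
    v = ((1.0 - m**2) * xi**2).sum(axis=1) + 1e-9    # its variance (xi**2 == 1 here; kept general)
    z = y * mu / np.sqrt(v)                          # margin in units of the fluctuation
    # Loss: -sum_mu log Phi(z_mu), the approximate log-probability that a
    # weight vector sampled from the factorized distribution is correct.
    g = np.exp(norm.logpdf(z) - log_ndtr(z))         # d log Phi / dz, computed stably
    dz_dm = (y[:, None] * xi) / np.sqrt(v)[:, None] \
          + (y * mu / v**1.5)[:, None] * (xi**2) * m[None, :]
    grad_h = -(g[:, None] * dz_dm).sum(axis=0) * (1.0 - m**2)  # chain rule through m = tanh(h)
    h -= lr * grad_h

w = np.where(h >= 0, 1.0, -1.0)              # binarize: most probable weight per synapse
print("train errors:", int(((xi @ w) * y <= 0).sum()))
```

The design choice illustrated here is that all learning happens in the real parameters h, while the binary weights appear only at the end via a sign readout; smoothing the hard threshold with a Gaussian approximation is one common way to make the objective differentiable, under the assumptions stated above.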

Authors (7)
  1. Carlo Baldassi (36 papers)
  2. Federica Gerace (13 papers)
  3. Hilbert J. Kappen (22 papers)
  4. Carlo Lucibello (38 papers)
  5. Luca Saglietti (21 papers)
  6. Enzo Tartaglione (68 papers)
  7. Riccardo Zecchina (48 papers)
Citations (21)
