
Binary stochasticity enabled highly efficient neuromorphic deep learning achieves better-than-software accuracy (2304.12866v1)

Published 25 Apr 2023 in cs.NE, cs.LG, eess.SP, and physics.data-an

Abstract: Deep learning requires high-precision handling of forward signals, backpropagated errors, and weight updates. This precision is inherently demanded by the learning algorithm, since the gradient-descent learning rule relies on the chain product of partial derivatives. However, it is challenging to implement deep learning in hardware systems that use noisy analog memristors as artificial synapses, and the high-precision requirement is also not biologically plausible. Memristor-based implementations generally incur an excessive cost in neuronal circuits and place stringent demands on idealized synaptic devices. Here, we demonstrate that high precision is not necessary and that more efficient deep learning can be achieved when this requirement is lifted. We propose a binary stochastic learning algorithm that modifies all elementary neural network operations by introducing (i) stochastic binarization of both the forward signals and the activation-function derivatives, (ii) signed binarization of the backpropagated errors, and (iii) step-wise weight updates. Through an extensive hybrid approach of software simulation and hardware experiments, we find that binary stochastic deep learning systems can outperform software-based benchmarks that use the high-precision learning algorithm. Moreover, the binary stochastic algorithm greatly simplifies the neural network operations in hardware, improving the energy efficiency of the multiply-and-accumulate operations by more than three orders of magnitude.
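To make the three modifications concrete, below is a minimal NumPy sketch of a single fully connected layer trained with (i) stochastic binarization of forward signals and activation derivatives, (ii) signed binarization of backpropagated errors, and (iii) step-wise weight updates. All function names, the sigmoid activation, and the fixed step size are illustrative assumptions for this sketch, not the authors' implementation or their hardware mapping.

```python
# Sketch of binary stochastic learning for one layer (assumed details).
import numpy as np

rng = np.random.default_rng(0)

def stochastic_binarize(x):
    """Binarize values in [0, 1] stochastically: P(output = 1) = x."""
    x = np.clip(x, 0.0, 1.0)
    return (rng.random(x.shape) < x).astype(np.float32)

def sign_binarize(e):
    """Signed binarization of a backpropagated error: keep only its sign."""
    return np.sign(e).astype(np.float32)

def forward(W, a_prev):
    # (i) Binary forward signal: the matrix-vector product then reduces
    # to accumulating the weights selected by the 0/1 activations.
    a_bin = stochastic_binarize(a_prev)
    z = W @ a_bin
    a = 1.0 / (1.0 + np.exp(-z))  # sigmoid activation (assumed)
    return a_bin, a

def backward(W, a_bin, a, error_in, step=0.01):
    # (i) The activation derivative a*(1-a) is also binarized stochastically.
    deriv_bin = stochastic_binarize(a * (1.0 - a))
    # (ii) The incoming error is reduced to its sign before gating.
    delta = sign_binarize(error_in) * deriv_bin
    error_out = W.T @ delta  # error passed to the previous layer
    # (iii) Step-wise update: each weight moves by a fixed increment,
    # mimicking coarse conductance steps of memristive synapses.
    dW = -step * np.sign(np.outer(delta, a_bin))
    return error_out, dW

# Tiny usage example on random data.
W = rng.normal(0.0, 0.1, size=(4, 8)).astype(np.float32)
a_prev = rng.random(8).astype(np.float32)
a_bin, a = forward(W, a_prev)
target = np.zeros(4, dtype=np.float32)
_, dW = backward(W, a_bin, a, a - target)
W += dW
```

Because every propagated quantity is binary or ternary, the multiply-and-accumulate operations degenerate into conditional additions, which is the source of the energy-efficiency gain the abstract claims.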

Authors (16)
  1. Yang Li
  2. Wei Wang
  3. Ming Wang
  4. Chunmeng Dou
  5. Zhengyu Ma
  6. Huihui Zhou
  7. Peng Zhang
  8. Nicola Lepri
  9. Xumeng Zhang
  10. Qing Luo
  11. Xiaoxin Xu
  12. Guanhua Yang
  13. Feng Zhang
  14. Ling Li
  15. Daniele Ielmini
  16. Ming Liu