Surrogate gradients for analog neuromorphic computing (2006.07239v3)

Published 12 Jun 2020 in cs.NE, cs.ET, cs.LG, q-bio.NC, and stat.ML

Abstract: To rapidly process temporal information at a low metabolic cost, biological neurons integrate inputs as an analog sum but communicate with spikes, binary events in time. Analog neuromorphic hardware uses the same principles to emulate spiking neural networks with exceptional energy-efficiency. However, instantiating high-performing spiking networks on such hardware remains a significant challenge due to device mismatch and the lack of efficient training algorithms. Here, we introduce a general in-the-loop learning framework based on surrogate gradients that resolves these issues. Using the BrainScaleS-2 neuromorphic system, we show that learning self-corrects for device mismatch resulting in competitive spiking network performance on both vision and speech benchmarks. Our networks display sparse spiking activity with, on average, far less than one spike per hidden neuron and input, perform inference at rates of up to 85 k frames/second, and consume less than 200 mW. In summary, our work sets several new benchmarks for low-energy spiking network processing on analog neuromorphic hardware and paves the way for future on-chip learning algorithms.
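The surrogate-gradient approach named in the abstract trains spiking networks by keeping the binary spike in the forward pass while substituting a smooth, non-zero function for the Heaviside step's derivative in the backward pass. As a minimal illustrative sketch (not the paper's implementation; the fast-sigmoid surrogate and the steepness parameter `beta` are assumptions for illustration):

```python
import numpy as np

def spike(v, threshold=1.0):
    # Forward pass: a neuron emits a binary spike when its membrane
    # potential v crosses the firing threshold.
    return (v >= threshold).astype(float)

def surrogate_grad(v, threshold=1.0, beta=10.0):
    # Backward pass: the Heaviside step has zero derivative almost
    # everywhere, so gradient descent cannot use it directly.
    # A common surrogate replaces it with the derivative of a
    # fast sigmoid, which is smooth and peaks at the threshold.
    return 1.0 / (1.0 + beta * np.abs(v - threshold)) ** 2

# Example: membrane potentials of four neurons.
v = np.array([0.2, 0.9, 1.1, 2.0])
spikes = spike(v)          # -> [0., 0., 1., 1.]
grads = surrogate_grad(v)  # non-zero everywhere, largest near threshold
```

In the in-the-loop setting the paper describes, the forward pass runs on the analog chip while gradients like `surrogate_grad` are computed off-chip, which is why learning can absorb device mismatch.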

Authors (12)
  1. Benjamin Cramer (13 papers)
  2. Sebastian Billaudelle (23 papers)
  3. Simeon Kanya (1 paper)
  4. Aron Leibfried (5 papers)
  5. Andreas Grübl (19 papers)
  6. Vitali Karasenko (10 papers)
  7. Christian Pehle (21 papers)
  8. Korbinian Schreiber (8 papers)
  9. Yannik Stradmann (13 papers)
  10. Johannes Weis (11 papers)
  11. Johannes Schemmel (66 papers)
  12. Friedemann Zenke (17 papers)
Citations (94)
