
Local learning through propagation delays in spiking neural networks (2211.08397v1)

Published 27 Oct 2022 in cs.NE, cs.AI, cs.LG, and q-bio.NC

Abstract: We propose a novel local learning rule for spiking neural networks in which spike propagation times undergo activity-dependent plasticity. Our plasticity rule aligns pre-synaptic spike times to produce a stronger and more rapid response. Inputs are encoded by latency coding and outputs decoded by matching similar patterns of output spiking activity. We demonstrate the use of this method in a three-layer feedforward network with inputs from a database of handwritten digits. Networks consistently improve their classification accuracy after training, and training with this method also allowed networks to generalize to an input class unseen during training. Our proposed method takes advantage of the ability of spiking neurons to support many different time-locked sequences of spikes, each of which can be activated by different input activations. The proof-of-concept shown here demonstrates the great potential for local delay learning to expand the memory capacity and generalizability of spiking neural networks.
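The two core ingredients of the abstract — latency coding of inputs and delay plasticity that aligns pre-synaptic spike arrival times — can be sketched as follows. This is a minimal illustrative toy, not the paper's actual rule: the linear latency mapping, the `align_delays` update (pulling each delay toward the mean arrival time of a coincident group), and the learning rate are all assumptions chosen to show the alignment idea.

```python
import numpy as np

def latency_encode(intensities, t_max=10.0):
    """Latency coding (assumed linear): stronger inputs spike earlier."""
    x = np.clip(np.asarray(intensities, dtype=float), 0.0, 1.0)
    return t_max * (1.0 - x)

def align_delays(spike_times, delays, lr=0.1):
    """Toy delay-plasticity step (hypothetical, not the paper's rule):
    shift each propagation delay toward the mean arrival time, so the
    pre-synaptic spikes arrive closer together and sum more effectively."""
    arrivals = spike_times + delays
    return delays + lr * (arrivals.mean() - arrivals)

# Three pre-synaptic neurons with graded input intensities.
spikes = latency_encode([1.0, 0.5, 0.2])  # earlier spikes for stronger inputs
delays = np.ones(3)                        # initial uniform propagation delays
for _ in range(50):
    delays = align_delays(spikes, delays)
arrivals = spikes + delays                 # spread shrinks as delays adapt
```

After repeated updates the spread of arrival times (`np.ptp(arrivals)`) collapses toward zero, illustrating how delay learning can turn a dispersed input pattern into a coincident, and hence stronger and faster, post-synaptic drive.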

Authors (4)
  1. Jørgen Jensen Farner (2 papers)
  2. Ola Huse Ramstad (3 papers)
  3. Stefano Nichele (30 papers)
  4. Kristine Heiney (7 papers)
