Paired Competing Neurons Improving STDP Supervised Local Learning In Spiking Neural Networks (2308.02194v2)

Published 4 Aug 2023 in cs.CV

Abstract: Direct training of Spiking Neural Networks (SNNs) on neuromorphic hardware has the potential to significantly reduce the energy consumption of artificial neural network training. SNNs trained with Spike Timing-Dependent Plasticity (STDP) benefit from gradient-free and unsupervised local learning, which can be easily implemented on ultra-low-power neuromorphic hardware. However, classification tasks cannot be performed solely with unsupervised STDP. In this paper, we propose Stabilized Supervised STDP (S2-STDP), a supervised STDP learning rule to train the classification layer of an SNN equipped with unsupervised STDP for feature extraction. S2-STDP integrates error-modulated weight updates that align neuron spikes with desired timestamps derived from the average firing time within the layer. Then, we introduce a training architecture called Paired Competing Neurons (PCN) to further enhance the learning capabilities of our classification layer trained with S2-STDP. PCN associates each class with paired neurons and encourages neuron specialization toward target or non-target samples through intra-class competition. We evaluate our methods on image recognition datasets, including MNIST, Fashion-MNIST, and CIFAR-10. Results show that our methods outperform state-of-the-art supervised STDP learning rules, for comparable architectures and numbers of neurons. Further analysis demonstrates that the use of PCN enhances the performance of S2-STDP, regardless of the hyperparameter set and without introducing any additional hyperparameters.
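
To make the described mechanism concrete, here is a minimal Python/NumPy sketch of an error-modulated, timestamp-aligned STDP update combined with paired class neurons (PCN), written from the abstract alone. The gap constant, the intra-pair winner rule, the eligibility trace, and all names are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of an error-modulated, timestamp-aligned STDP update
# with two output neurons per class (PCN). Constants and the intra-pair
# competition rule are assumptions made for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_classes = 100, 10
W = rng.uniform(0.0, 1.0, size=(2 * n_classes, n_in))  # two neurons per class

def s2_stdp_update(W, in_spike_times, out_spike_times, label,
                   lr=0.01, gap=1.0, t_max=20.0):
    """One error-modulated update: push each output neuron's spike time
    toward a desired timestamp derived from the layer's mean firing time."""
    t_mean = out_spike_times.mean()
    n_out = W.shape[0]
    desired = np.full(n_out, t_mean + gap)           # non-target: fire later
    desired[2 * label:2 * label + 2] = t_mean - gap  # target pair: fire earlier

    # Intra-pair competition (assumed rule): only the earlier-firing neuron of
    # the target pair keeps the target-side desired time for this sample.
    pair = out_spike_times[2 * label:2 * label + 2]
    loser = 2 * label + int(np.argmax(pair))
    desired[loser] = t_mean + gap

    # Error = actual spike time minus desired spike time (normalized).
    err = (out_spike_times - desired) / t_max
    # STDP-like eligibility (simplified): inputs spiking before the output
    # spike are potentiated, later inputs are depressed.
    pre_before_post = in_spike_times[None, :] <= out_spike_times[:, None]
    elig = np.where(pre_before_post, 1.0, -1.0)
    return W + lr * err[:, None] * elig

# Toy usage with random spike times in [0, 20] ms.
in_t = rng.uniform(0, 20, n_in)
out_t = rng.uniform(0, 20, 2 * n_classes)
W = s2_stdp_update(W, in_t, out_t, label=3)
```

In this sketch, a neuron that fires later than its desired timestamp (positive error) has its causal input weights strengthened so it fires earlier next time, while the paired-neuron bookkeeping lets one neuron of each pair specialize toward target samples and the other toward non-target samples, as the abstract describes at a high level.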

Authors (3)
  1. Gaspard Goupy (3 papers)
  2. Pierre Tirilly (10 papers)
  3. Ioan Marius Bilasco (16 papers)
Citations (2)
