Spiking mode-based neural networks (2310.14621v3)

Published 23 Oct 2023 in q-bio.NC, cond-mat.dis-nn, cs.AI, and cs.NE

Abstract: Spiking neural networks play an important role in brain-like neuromorphic computations and in studying the working mechanisms of neural circuits. One drawback of training a large-scale spiking neural network is that updating all weights is quite expensive. Furthermore, after training, all information related to the computational task is hidden in the weight matrix, preventing a transparent understanding of circuit mechanisms. We address these challenges by proposing a spiking mode-based training protocol, in which the recurrent weight matrix is expressed as a Hopfield-like product of three matrices: input modes, output modes, and a score matrix. The first advantage is that the weight matrix is interpreted through the input and output modes and their associated scores, which characterize the importance of each decomposition term. The number of modes is adjustable, allowing more degrees of freedom for modeling the experimental data. This significantly reduces the training cost because the space complexity of learning is much lower; training of the spiking network is thus carried out in the mode-score space. The second advantage is that one can project the high-dimensional neural activity (filtered spike train) in the state space onto the mode space, which is typically of low dimension, e.g., a few modes are sufficient to capture the shape of the underlying neural manifolds. We successfully apply our framework to two computational tasks: digit classification and selective sensory integration. Our method accelerates the training of spiking neural networks through a Hopfield-like decomposition, and moreover the training leads to low-dimensional attractor structures in the high-dimensional neural dynamics.
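The central construction in the abstract is a low-rank, Hopfield-like factorization of the recurrent weight matrix into input modes, output modes, and a score matrix, with learning carried out in the mode-score space. The snippet below is a minimal illustrative sketch of such a parameterization in NumPy; the assumed form W = ξ_out diag(λ) ξ_in^T, the normalization, and all variable names and sizes are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

# Sketch of a mode-based weight parameterization (assumed form):
# an N x N recurrent weight matrix built from P input modes, P output modes,
# and a P-dimensional score vector, so only O(N*P) parameters are trained
# instead of O(N^2).
N, P = 500, 8                                        # network size, number of modes (illustrative)
rng = np.random.default_rng(0)

xi_out = rng.standard_normal((N, P)) / np.sqrt(N)    # output modes (one column per mode)
xi_in = rng.standard_normal((N, P)) / np.sqrt(N)     # input modes (one column per mode)
scores = rng.standard_normal(P)                      # importance score for each mode

# Hopfield-like construction: W = sum_mu scores[mu] * xi_out[:, mu] * xi_in[:, mu]^T
W = xi_out @ np.diag(scores) @ xi_in.T               # shape (N, N)

# Projecting high-dimensional activity onto the low-dimensional mode space:
r = rng.standard_normal(N)                           # filtered spike train (placeholder activity)
mode_activity = xi_in.T @ r                          # P-dimensional mode-space coordinates
```

In such a scheme, gradients would be taken with respect to the modes and scores rather than the full weight matrix, which is what reduces the space complexity of learning.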
