
SparseProp: Efficient Event-Based Simulation and Training of Sparse Recurrent Spiking Neural Networks (2312.17216v1)

Published 28 Dec 2023 in q-bio.NC, cs.AI, cs.LG, cs.NE, and stat.ML

Abstract: Spiking Neural Networks (SNNs) are biologically-inspired models that are capable of processing information in streams of action potentials. However, simulating and training SNNs is computationally expensive due to the need to solve large systems of coupled differential equations. In this paper, we introduce SparseProp, a novel event-based algorithm for simulating and training sparse SNNs. Our algorithm reduces the computational cost of both the forward and backward pass operations from O(N) to O(log(N)) per network spike, thereby enabling numerically exact simulations of large spiking networks and their efficient training using backpropagation through time. By leveraging the sparsity of the network, SparseProp eliminates the need to iterate through all neurons at each spike, employing efficient state updates instead. We demonstrate the efficacy of SparseProp across several classical integrate-and-fire neuron models, including a simulation of a sparse SNN with one million LIF neurons. This results in a speed-up exceeding four orders of magnitude relative to previous event-based implementations. Our work provides an efficient and exact solution for training large-scale spiking neural networks and opens up new possibilities for building more sophisticated brain-inspired models.
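To illustrate the core idea described in the abstract, below is a minimal sketch of an event-based simulation of a sparse inhibitory LIF network driven by a constant suprathreshold input, using a priority queue so that each network spike touches only the spiking neuron and its K postsynaptic targets (roughly O(K log N) heap work) instead of all N neurons. This is not the authors' implementation: all names and parameter values are illustrative, the backward pass (backpropagation through time) and the other neuron models from the paper are omitted, and a binary heap with lazy invalidation stands in for whatever priority-queue variant the paper uses.

```python
# Hedged sketch: event-based simulation of a sparse inhibitory LIF network.
# Per-neuron state is stored lazily as (V_last, t_last); between events
# V(t) = I_ext + (V_last - I_ext) * exp(-(t - t_last)), so only neurons
# affected by a spike ever need to be updated.
import heapq
import math
import random

N = 1000            # number of neurons (illustrative)
K = 10              # synapses per neuron (sparse: K << N)
J = 0.1             # inhibitory pulse-coupling strength
I_ext = 1.5         # constant external drive, above threshold
V_T, V_R = 1.0, 0.0 # spike threshold and reset (membrane time constant = 1)

random.seed(0)
targets = [random.sample(range(N), K) for _ in range(N)]  # random sparse out-degree K

V_last = [random.uniform(V_R, V_T) for _ in range(N)]
t_last = [0.0] * N
version = [0] * N   # lazy invalidation of stale heap entries (heapq has no decrease-key)

def next_spike_time(i, t_now):
    """Closed-form time of neuron i's next threshold crossing."""
    v = I_ext + (V_last[i] - I_ext) * math.exp(-(t_now - t_last[i]))
    if v >= V_T:
        return t_now
    if I_ext <= V_T:
        return math.inf  # never reaches threshold without further input
    return t_now + math.log((I_ext - v) / (I_ext - V_T))

heap = []
for i in range(N):
    heapq.heappush(heap, (next_spike_time(i, 0.0), i, version[i]))

def update_neuron(i, t, dv):
    """Bring neuron i to time t, apply input dv, re-queue its predicted spike (O(log N))."""
    V_last[i] = I_ext + (V_last[i] - I_ext) * math.exp(-(t - t_last[i])) + dv
    t_last[i] = t
    version[i] += 1
    heapq.heappush(heap, (next_spike_time(i, t), i, version[i]))

# Event loop: per network spike, only the spiker and its K targets are touched.
n_spikes, t = 0, 0.0
while n_spikes < 10 * N:
    t, j, ver = heapq.heappop(heap)
    if ver != version[j]:
        continue                      # stale entry, skip
    n_spikes += 1
    V_last[j], t_last[j] = V_R, t     # reset the spiking neuron
    version[j] += 1
    heapq.heappush(heap, (next_spike_time(j, t), j, version[j]))
    for i in targets[j]:
        update_neuron(i, t, -J)       # inhibitory pulse to K postsynaptic neurons

print(f"simulated {n_spikes} spikes up to t = {t:.2f}")
```

The inhibitory-only coupling keeps the sketch simple because an update can only delay a neuron's next spike; handling excitatory inputs, the exact priority-queue data structure, and the event-based gradient computation are where the paper's actual contribution lies.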
