Efficient Online Learning for Networks of Two-Compartment Spiking Neurons (2402.15969v1)

Published 25 Feb 2024 in cs.NE

Abstract: Brain-inspired Spiking Neural Networks (SNNs) have garnered considerable research interest due to their superior performance and energy efficiency in processing temporal signals. Recently, a novel multi-compartment spiking neuron model, namely the Two-Compartment LIF (TC-LIF) model, has been proposed and has exhibited a remarkable capacity for sequential modeling. However, training the TC-LIF model presents challenges stemming from the large memory consumption and vanishing gradients associated with the Backpropagation Through Time (BPTT) algorithm. To address these challenges, online learning methodologies emerge as a promising solution. Yet, to date, the application of online learning methods to SNNs has been largely confined to simplified Leaky Integrate-and-Fire (LIF) neuron models. In this paper, we present a novel online learning method specifically tailored for networks of TC-LIF neurons. Additionally, we propose a refined TC-LIF neuron model, called Adaptive TC-LIF, which is carefully designed to enhance temporal information integration in online learning scenarios. Extensive experiments on various sequential benchmarks demonstrate that our approach preserves the superior sequential modeling capabilities of the TC-LIF neuron while gaining the training efficiency and hardware friendliness of online learning. As a result, it offers a multitude of opportunities for deploying neuromorphic solutions to process temporal signals.
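The two-compartment idea can be pictured with a minimal sketch: one compartment integrates synaptic input and drives a second, spike-generating compartment, giving the neuron a longer effective memory than a single leaky integrator. The snippet below is an illustrative toy, not the paper's actual TC-LIF or Adaptive TC-LIF formulation; the parameter names (beta_d, beta_s, g, v_th), the coupling scheme, and the soft reset are assumptions made for this example.

```python
import numpy as np

def tc_lif_step(v_d, v_s, x, beta_d=0.9, beta_s=0.9, g=0.5, v_th=1.0):
    """One discrete-time update of a toy two-compartment spiking neuron.

    v_d : dendritic membrane potential (NumPy array, one entry per neuron)
    v_s : somatic membrane potential
    x   : synaptic input current at the current time step
    """
    v_d = beta_d * v_d + x                    # dendritic compartment integrates input
    v_s = beta_s * v_s + g * v_d              # soma is driven by the dendritic potential
    spike = (v_s >= v_th).astype(v_s.dtype)   # hard threshold produces binary spikes
    v_s = v_s - spike * v_th                  # soft reset after a spike
    return v_d, v_s, spike

# Usage: simulate a small layer of 4 neurons for 100 time steps.
rng = np.random.default_rng(0)
v_d, v_s = np.zeros(4), np.zeros(4)
for t in range(100):
    x = rng.normal(0.0, 0.5, size=4)          # stand-in for weighted presynaptic spikes
    v_d, v_s, spike = tc_lif_step(v_d, v_s, x)
```

Because the state carried to the next step is just (v_d, v_s), an online learning rule can update weights as time unfolds, rather than storing the full activation history over the sequence as BPTT does, which is the memory advantage the abstract alludes to.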

