Parallel Spiking Unit for Efficient Training of Spiking Neural Networks (2402.00449v3)
Abstract: Efficient parallel computing has become pivotal to advancing artificial intelligence, yet Spiking Neural Networks (SNNs) are hampered in this domain by their inherent sequential dependency: each time step's computation relies on the previous step's outcome, which prevents SNN models from exploiting massively parallel hardware. To address this, our paper introduces the Parallel Spiking Unit (PSU) and two derivatives, the Input-aware PSU (IPSU) and the Reset-aware PSU (RPSU). These variants decouple the leaky-integration and firing mechanisms of spiking neurons and handle the reset process probabilistically. While preserving the fundamental computational attributes of the spiking neuron model, this design allows all membrane-potential instances in an SNN to be computed concurrently, enabling parallel spike generation and substantially improving computational efficiency. Experiments on a range of datasets, including static and sequential images, Dynamic Vision Sensor (DVS) data, and speech, show that the PSU and its variants significantly improve accuracy and simulation speed while also increasing the energy efficiency of SNNs through sparser neural activity. These results underscore the method's potential for deploying SNNs in high-performance parallel computing applications.
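To make the parallelization idea in the abstract concrete, here is a minimal sketch of how decoupling the reset from leaky integration allows all membrane potentials to be computed at once. Assuming a leak factor `lam` and no reset, the potential obeys `H[t] = sum_{s<=t} lam**(t-s) * X[s]`, so the whole sequence is one matrix product with a lower-triangular decay matrix instead of a step-by-step loop. All names (`lam`, `threshold`, `A`) are illustrative; this is not the paper's exact PSU formulation, which additionally handles the reset probabilistically.

```python
import numpy as np

T = 8                    # number of simulation time steps
lam = 0.5                # leak (decay) factor, 0 < lam < 1
threshold = 1.0          # firing threshold
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 0.6, size=T)   # input currents for all steps

# Lower-triangular decay matrix: A[t, s] = lam**(t - s) for s <= t, else 0.
t_idx = np.arange(T)
A = np.tril(lam ** (t_idx[:, None] - t_idx[None, :]))

# Membrane potentials for every time step in one matrix product,
# then spikes emitted in parallel by thresholding.
H = A @ X
S = (H >= threshold).astype(float)

# Sanity check: the sequential leaky-integration recurrence
# h_t = lam * h_{t-1} + x_t (no reset) yields the same potentials.
h, H_seq = 0.0, []
for x in X:
    h = lam * h + x
    H_seq.append(h)
assert np.allclose(H, H_seq)
```

Because the matrix product has no step-to-step dependency, it maps directly onto GPU-style parallel hardware, which is the efficiency gain the abstract describes; the IPSU/RPSU variants then reintroduce input- and reset-awareness on top of this parallel form.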