
Deep Pulse-Coupled Neural Networks (2401.08649v1)

Published 24 Dec 2023 in cs.NE and cs.LG

Abstract: Spiking Neural Networks (SNNs) capture the information processing mechanism of the brain through spiking neurons, such as the Leaky Integrate-and-Fire (LIF) neuron, which incorporates temporal dynamics and transmits information via discrete, asynchronous spikes. However, the LIF model's simplified biological properties ignore the neuronal coupling and dendritic structure of real neurons, which limits the spatio-temporal dynamics of the neurons and thus reduces the expressive power of the resulting SNNs. In this work, we leverage a more biologically plausible neural model with complex dynamics, the pulse-coupled neural network (PCNN), to improve the expressiveness and recognition performance of SNNs on vision tasks. The PCNN is a cortical model capable of emulating the complex neuronal activities of the primary visual cortex. We construct deep pulse-coupled neural networks (DPCNNs) by replacing the commonly used LIF neurons in SNNs with PCNN neurons. Existing PCNN models couple neurons only within a channel (intra-channel coupling). To address this limitation, we propose inter-channel coupling, which lets neurons in different feature maps interact with each other. Experimental results show that inter-channel coupling boosts performance more efficiently than widening the network, requiring fewer neurons and synapses and less training time. For instance, compared to an LIF-based SNN with a wide VGG9, the DPCNN with VGG9 uses only 50% of the neurons, 53% of the synapses, and 73% of the training time. Furthermore, we propose receptive-field and time-dependent batch normalization (RFTD-BN) to speed up the convergence and improve the performance of DPCNNs.
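To make the dynamics the abstract describes concrete, here is a minimal NumPy sketch of one discrete-time PCNN update over a stack of feature maps, following the classical Eckhorn-style feeding/linking/threshold equations. The constants, the 3x3 coupling kernel, and the cross-channel mean term are illustrative assumptions only; the paper's DPCNN learns its coupling weights and applies RFTD-BN, neither of which is reproduced here.

```python
import numpy as np
from scipy.signal import convolve2d

# 3x3 intra-channel coupling kernel: each neuron listens to its
# spatial neighbours' spikes (no self-connection at the centre).
K = np.array([[0.5, 1.0, 0.5],
              [1.0, 0.0, 1.0],
              [0.5, 1.0, 0.5]])

def dpcnn_step(S, F, L, E, Y, beta=0.2, gamma=0.1,
               aF=0.1, aL=0.3, aE=0.5, VF=0.1, VL=0.2, VE=20.0):
    """One PCNN timestep over a (C, H, W) stack of feature maps.

    S is the external stimulus; F, L, E, Y are the feeding input,
    linking input, dynamic threshold, and binary spike map from the
    previous step. gamma weights the illustrative inter-channel term.
    """
    # Spikes gathered from each neuron's spatial neighbourhood, per channel.
    nb = np.stack([convolve2d(y, K, mode="same") for y in Y])
    # Inter-channel coupling (an assumption standing in for the paper's
    # learned scheme): each channel also sees the mean spike activity
    # of all feature maps at the same spatial location.
    cross = Y.mean(axis=0, keepdims=True)
    F = np.exp(-aF) * F + VF * nb + S              # feeding: leak + neighbours + stimulus
    L = np.exp(-aL) * L + VL * nb + gamma * cross  # linking: leak + neighbours + cross-channel
    U = F * (1.0 + beta * L)                       # linking modulates feeding multiplicatively
    Y = (U > E).astype(float)                      # fire where activity beats the threshold
    E = np.exp(-aE) * E + VE * Y                   # threshold jumps after a spike, then decays
    return F, L, E, Y

# Example: run a few timesteps on a random 4-channel stimulus.
rng = np.random.default_rng(0)
S = rng.random((4, 32, 32))
F, L, Y = (np.zeros_like(S) for _ in range(3))
E = np.ones_like(S)
for _ in range(10):
    F, L, E, Y = dpcnn_step(S, F, L, E, Y)
```

Contrast this with the plain LIF update, which keeps a single leaky state and fires when it crosses a fixed threshold: the multiplicative F * (1 + beta * L) modulation and the spike-driven dynamic threshold E are what give PCNN neurons the richer spatio-temporal dynamics the paper exploits.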
