
Consciousness Driven Spike Timing Dependent Plasticity (2405.04546v1)

Published 4 May 2024 in q-bio.NC

Abstract: Spiking Neural Networks (SNNs), recognized for their biological plausibility and energy efficiency, employ sparse and asynchronous spikes for communication. However, training SNNs is difficult because of non-differentiable activation functions and the movement of spike-based data between layers. Spike-Timing Dependent Plasticity (STDP), inspired by neurobiology, plays a crucial role in SNN learning, but it still lacks the conscious component of the brain used for learning. To address this issue, this work proposes Consciousness Driven STDP (CD-STDP), an improved rule that addresses inherent limitations of conventional STDP models. CD-STDP, designed to infuse the conscious component as coefficients of long-term potentiation (LTP) and long-term depression (LTD), exhibits a dynamic nature. The model ties the LTP and LTD coefficients to the current and past states of synaptic activity, respectively, enhancing awareness and adaptability. This consciousness enables the model to learn effectively while understanding the input patterns. Adjusting the conscious coefficients in response to current and past synaptic activity extends the model's conscious and other cognitive capabilities, offering a refined and efficient approach for real-world applications. Evaluations on the MNIST, FashionMNIST and CALTECH datasets show that CD-STDP achieves accuracies of 98.6%, 85.61% and 99.0%, respectively, in a single-hidden-layer SNN. In addition, an analysis of the conscious elements and the consciousness of the proposed model on SNNs is performed.
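The abstract describes LTP and LTD coefficients that vary with current and past synaptic activity, on top of a standard pair-based STDP rule. The paper's exact equations are not given here, so the following is only a minimal illustrative sketch: the function names, constants, and the linear form of the activity modulation are assumptions, not the authors' formulation.

```python
import math

# Assumed constants for a pair-based STDP rule; the paper's actual
# values and modulation form may differ.
TAU = 20.0         # STDP time constant in ms (assumed)
A_LTP_BASE = 0.01  # base LTP amplitude (assumed)
A_LTD_BASE = 0.012 # base LTD amplitude (assumed)

def cd_stdp_dw(dt_ms, current_activity, past_activity):
    """Weight change for one pre/post spike pair separated by dt_ms.

    dt_ms > 0 (post fires after pre) -> potentiation, with the LTP
    coefficient scaled by the current synaptic activity level.
    dt_ms <= 0 -> depression, with the LTD coefficient scaled by a
    trace of past synaptic activity, mirroring the abstract's
    current/past coefficient pairing.
    """
    if dt_ms > 0:
        a_ltp = A_LTP_BASE * current_activity  # dynamic LTP coefficient
        return a_ltp * math.exp(-dt_ms / TAU)
    a_ltd = A_LTD_BASE * past_activity         # dynamic LTD coefficient
    return -a_ltd * math.exp(dt_ms / TAU)
```

Under this sketch, a synapse with high recent activity potentiates more strongly for causal spike pairs, while accumulated past activity deepens depression for anti-causal pairs; conventional STDP is recovered when both activity terms are fixed at 1.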

