
SiGNN: A Spike-induced Graph Neural Network for Dynamic Graph Representation Learning (2404.07941v1)

Published 11 Mar 2024 in cs.NE, cs.AI, and cs.LG

Abstract: In the domain of dynamic graph representation learning (DGRL), efficiently and comprehensively capturing the temporal evolution of real-world networks is crucial. Spiking Neural Networks (SNNs), known for their temporal dynamics and low-power characteristics, offer an efficient solution for temporal processing in DGRL tasks. However, owing to the spike-based information encoding mechanism of SNNs, existing DGRL methods that employ SNNs face limitations in their representational capacity. To address this issue, we propose a novel framework named Spike-induced Graph Neural Network (SiGNN) for learning enhanced spatial-temporal representations on dynamic graphs. Specifically, a harmonious integration of SNNs and GNNs is achieved through an innovative Temporal Activation (TA) mechanism. Benefiting from the TA mechanism, SiGNN not only effectively exploits the temporal dynamics of SNNs but also adeptly circumvents the representational constraints imposed by the binary nature of spikes. Furthermore, leveraging the inherent adaptability of SNNs, we conduct an in-depth analysis of the evolutionary patterns of dynamic graphs across multiple time granularities. This approach facilitates the acquisition of multiscale temporal node representations. Extensive experiments on various real-world dynamic graph datasets demonstrate the superior performance of SiGNN on the node classification task.
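
The abstract's central idea, using spike dynamics to modulate real-valued GNN features rather than forcing the features themselves to be binary spikes, can be pictured with a small sketch. The code below is a minimal illustration under stated assumptions, not the authors' SiGNN implementation: the functions `gcn_layer`, `lif_step`, and `ta_gate`, and the specific gating rule, are hypothetical simplifications introduced here for exposition.

```python
# Illustrative sketch only: couples leaky integrate-and-fire (LIF) spiking
# dynamics with a graph convolution over dynamic-graph snapshots. Function
# names and the gating scheme are assumptions, not the SiGNN paper's code.
import numpy as np

def gcn_layer(adj, x, w):
    """One graph convolution: symmetric normalization of adj, then a linear map + ReLU."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.clip(deg, 1e-12, None))
    a_norm = adj * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(a_norm @ x @ w, 0.0)

def lif_step(v, x, tau=2.0, v_th=1.0):
    """LIF update: leak membrane potential toward the input, emit binary spikes
    where the threshold is crossed, then hard-reset the spiking neurons."""
    v = v + (x - v) / tau
    spikes = (v >= v_th).astype(x.dtype)
    v = v * (1.0 - spikes)
    return v, spikes

def ta_gate(h, spikes, alpha=0.5):
    """Hypothetical 'temporal activation' gate: spikes modulate the real-valued
    GNN features instead of replacing them, so embeddings stay continuous."""
    return h * (alpha + (1.0 - alpha) * spikes)

rng = np.random.default_rng(0)
num_nodes, in_dim, hid_dim, num_steps = 5, 8, 4, 3

w = rng.normal(scale=0.5, size=(in_dim, hid_dim))
v = np.zeros((num_nodes, hid_dim))          # membrane potentials carried across snapshots
h_t = []                                    # per-snapshot node embeddings

for t in range(num_steps):
    # Random undirected snapshot with self-loops, standing in for graph at time t.
    adj = (rng.random((num_nodes, num_nodes)) < 0.4).astype(float)
    adj = np.maximum(adj, adj.T)
    np.fill_diagonal(adj, 1.0)
    x = rng.normal(size=(num_nodes, in_dim))

    h = gcn_layer(adj, x, w)                # spatial aggregation
    v, s = lif_step(v, h)                   # temporal spiking dynamics
    h_t.append(ta_gate(h, s))               # spike-modulated, still real-valued features

node_repr = np.mean(h_t, axis=0)            # simple readout over time steps
print(node_repr.shape)                      # (5, 4)
```

A multiscale variant in the abstract's sense would run such dynamics at several temporal granularities (e.g., different snapshot windows) and combine the resulting node representations; the averaging readout above is only the simplest placeholder for that step.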
