Deep learning for dynamic graphs: models and benchmarks

Published 12 Jul 2023 in cs.LG and cs.SI (arXiv:2307.06104v4)

Abstract: Recent progress in research on Deep Graph Networks (DGNs) has led to a maturation of the domain of learning on graphs. Despite the growth of this research field, important challenges remain unsolved. In particular, there is an urgent need to make DGNs suitable for predictive tasks on real-world systems of interconnected entities that evolve over time. To foster research on dynamic graphs, we first survey recent advances in learning both temporal and spatial information, providing a comprehensive overview of the state of the art in representation learning for dynamic graphs. Second, we conduct a fair performance comparison among the most popular proposed approaches on node- and edge-level tasks, leveraging rigorous model selection and assessment for all methods, thus establishing a sound baseline for evaluating new architectures and approaches.
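The fair comparison described in the abstract hinges on evaluating every model under the same rigorous protocol. A minimal sketch of one ingredient of such a protocol — a chronological train/validation/test split over a stream of timestamped edges, so a model is never assessed on interactions that precede the ones it was trained on — is shown below. This is an assumption-laden illustration, not the paper's actual benchmark code; the names `EdgeEvent` and `temporal_split` are hypothetical.

```python
# Hedged sketch (not the paper's pipeline): a continuous-time dynamic graph
# as a stream of timestamped edge events, split chronologically so that
# training data strictly precedes validation and test data.
from typing import List, NamedTuple, Tuple

class EdgeEvent(NamedTuple):
    src: int    # source node id
    dst: int    # destination node id
    t: float    # event timestamp

def temporal_split(events: List[EdgeEvent], n_val: int, n_test: int
                   ) -> Tuple[List[EdgeEvent], List[EdgeEvent], List[EdgeEvent]]:
    """Sort events by time; earliest go to training, latest to testing."""
    events = sorted(events, key=lambda e: e.t)
    n_train = len(events) - n_val - n_test
    return (events[:n_train],
            events[n_train:n_train + n_val],
            events[n_train + n_val:])

stream = [EdgeEvent(0, 1, 0.1), EdgeEvent(1, 2, 0.5), EdgeEvent(2, 0, 0.9),
          EdgeEvent(0, 2, 1.4), EdgeEvent(3, 1, 2.0), EdgeEvent(1, 3, 2.7)]
train, val, test = temporal_split(stream, n_val=1, n_test=1)
print(len(train), len(val), len(test))            # 4 1 1
print(max(e.t for e in train) < min(e.t for e in test))  # True
```

A chronological split of this kind (rather than a random one) is what makes model selection on dynamic graphs meaningful: shuffling events would leak future interactions into training.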
