Continual Learning on Graphs: Challenges, Solutions, and Opportunities (2402.11565v1)
Abstract: Continual learning on graph data has recently attracted significant attention, as it aims to resolve the catastrophic forgetting problem on existing tasks while adapting the sequentially updated model to newly emerged graph tasks. While there have been efforts to summarize progress in continual learning research on Euclidean data, e.g., images and texts, a systematic review of continual learning on graphs, a.k.a. continual graph learning (CGL) or lifelong graph learning, is still lacking. Graph data are far more complex in terms of data structures and application scenarios, which makes CGL task settings, model designs, and applications particularly challenging. To bridge this gap, we provide a comprehensive review of existing CGL algorithms by elucidating the different task settings and categorizing the existing methods based on their characteristics. We compare CGL methods with traditional continual learning techniques and analyze the applicability of those traditional techniques to CGL tasks. Additionally, we review the benchmark works that are crucial to CGL research. Finally, we discuss the remaining challenges and propose several future directions. We maintain an up-to-date GitHub repository featuring a comprehensive list of CGL algorithms, accessible at https://github.com/UConn-DSIS/Survey-of-Continual-Learning-on-Graphs.
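To make the catastrophic-forgetting setting concrete, below is a minimal sketch of one common CGL ingredient: a fixed-size replay memory of node examples, filled by reservoir sampling, from which past-task samples are mixed into each new-task training batch. The class name, the reservoir scheme, and the (feature, label) storage are illustrative assumptions for exposition, not a method proposed by the survey.

```python
import numpy as np

class ReplayBuffer:
    """Fixed-size memory of (node_feature, label) pairs from past tasks.

    Uses reservoir sampling so that, after seeing n examples, each one
    is retained with probability capacity / n (uniform over the stream).
    """

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.data = []          # stored (x, y) pairs
        self.seen = 0           # total examples observed so far
        self.rng = np.random.default_rng(seed)

    def add(self, x, y):
        """Offer one (feature, label) pair to the reservoir."""
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append((x, y))
        else:
            # Replace a random slot with probability capacity / seen.
            j = int(self.rng.integers(0, self.seen))
            if j < self.capacity:
                self.data[j] = (x, y)

    def sample(self, k):
        """Draw up to k stored pairs without replacement for replay."""
        k = min(k, len(self.data))
        idx = self.rng.choice(len(self.data), size=k, replace=False)
        return [self.data[i] for i in idx]


# Illustrative use: while training on task t, each gradient step would
# combine the current batch with buf.sample(k) from earlier tasks.
buf = ReplayBuffer(capacity=5)
for i in range(100):
    buf.add(np.full(3, float(i)), i % 2)
replayed = buf.sample(3)
```

In graph settings the stored unit is often richer than a raw feature vector, e.g., a sparsified ego-subgraph around each memorized node, so that message passing can still be run over the replayed examples.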