Addressing the Impact of Localized Training Data in Graph Neural Networks (2307.12689v2)

Published 24 Jul 2023 in cs.LG and cs.AI

Abstract: Graph Neural Networks (GNNs) have achieved notable success in learning from graph-structured data, owing to their ability to capture intricate dependencies and relationships between nodes. They excel in various applications, including semi-supervised node classification, link prediction, and graph generation. However, most state-of-the-art GNN models assume an in-distribution setting, which hinders their performance on real-world graphs with dynamic structures. In this article, we assess the impact of training GNNs on localized subsets of the graph. Such restricted training data may yield a model that performs well in the specific region it was trained on but fails to generalize and make accurate predictions for the entire graph. In graph-based semi-supervised learning (SSL), resource constraints often mean that only a portion of a large dataset can be labeled, which limits model performance. This affects tasks such as anomaly detection and spam detection, where the labeling process is biased or influenced by human subjectivity. To tackle the challenges posed by localized training data, we approach the problem as an out-of-distribution (OOD) data issue: we align the distribution of the training data, which represents a small portion of labeled nodes, with that of the graph inference process, which makes predictions for the entire graph. We propose a regularization method that minimizes distributional discrepancies between localized training data and graph inference, improving model performance on OOD data. Extensive tests on popular GNN models show significant performance improvements on three citation-network GNN benchmark datasets. The regularization approach effectively enhances model adaptation and generalization, overcoming the challenges posed by OOD data.
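The page gives no implementation details, but the core idea in the abstract, adding a regularizer that penalizes the distributional discrepancy between embeddings of the localized labeled nodes and embeddings of nodes drawn from the rest of the graph, can be sketched as below. This is a minimal illustration only: it assumes a PyTorch GNN that returns node embeddings alongside class logits, and it uses an RBF-kernel MMD as the discrepancy measure. The names (`mmd_rbf`, `training_step`, `lam`) and the choice of MMD are hypothetical, not the authors' exact method.

```python
import torch
import torch.nn.functional as F

def mmd_rbf(x, y, sigma=1.0):
    """Maximum Mean Discrepancy with an RBF kernel.

    One common choice of distributional discrepancy; the paper may use
    a different measure. x, y: [n, d] and [m, d] embedding matrices.
    """
    def kernel(a, b):
        d2 = torch.cdist(a, b) ** 2          # pairwise squared distances
        return torch.exp(-d2 / (2 * sigma ** 2))
    # Biased MMD^2 estimate: E[k(x,x)] + E[k(y,y)] - 2 E[k(x,y)]
    return kernel(x, x).mean() + kernel(y, y).mean() - 2 * kernel(x, y).mean()

def training_step(model, x, edge_index, y, train_idx, rest_idx,
                  optimizer, lam=0.5):
    """One hypothetical training step with distribution-alignment regularization.

    model: any GNN returning (node_embeddings, logits) -- an assumed interface.
    train_idx: indices of the localized labeled nodes.
    rest_idx: indices sampled from the remainder of the graph.
    lam: regularization weight (hypothetical hyperparameter).
    """
    model.train()
    optimizer.zero_grad()
    h, logits = model(x, edge_index)
    # Standard supervised loss on the localized labeled subset.
    ce = F.cross_entropy(logits[train_idx], y[train_idx])
    # Align embedding distributions of labeled nodes vs. the rest of the graph.
    reg = mmd_rbf(h[train_idx], h[rest_idx])
    loss = ce + lam * reg
    loss.backward()
    optimizer.step()
    return loss.item()
```

Under this reading, `lam` trades off fitting the localized labels against keeping the labeled-node embedding distribution close to that of the full graph, which is what lets the model generalize beyond the region it was trained on.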

Authors (1)
  1. Akansha A
