
Mitigating Oversmoothing Through Reverse Process of GNNs for Heterophilic Graphs (2403.10543v2)

Published 11 Mar 2024 in cs.SI and cs.LG

Abstract: Message passing in Graph Neural Networks (GNNs) resembles a diffusion process, which leads to over-smoothing of the learned representations when many layers are stacked. Hence, the reverse process of message passing can produce distinguishable node representations by inverting the forward propagation. Distinguishable representations help classify neighboring nodes with different labels, as in heterophilic graphs. In this work, we apply the design principle of the reverse process to three variants of GNNs. Through experiments on heterophilic graph data, where adjacent nodes must have different representations for successful classification, we show that the reverse process significantly improves prediction performance in many cases. Further analysis reveals that the reverse mechanism can mitigate over-smoothing across hundreds of layers. Our code is available at https://github.com/ml-postech/reverse-gnn.
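The core idea of the abstract, undoing diffusion-like message passing to recover sharper node features, can be illustrated with a minimal sketch. The snippet below is not the authors' implementation (see the linked repository for that); it assumes the reverse step is computed by fixed-point inversion of a contractive residual message-passing layer, in the spirit of invertible residual networks. The names `forward_step` and `reverse_step` and all parameter choices are illustrative.

```python
import torch

def forward_step(x, adj_norm, weight, alpha=0.5):
    """One diffusion-style message-passing step in residual form:
    y = x + alpha * (A_hat @ x) @ W."""
    return x + alpha * adj_norm @ x @ weight

def reverse_step(y, adj_norm, weight, alpha=0.5, n_iters=50):
    """Invert the forward step by fixed-point iteration:
    solve x = y - alpha * (A_hat @ x) @ W for x.
    Converges when the residual map is contractive (small alpha
    and small-norm W), as assumed here."""
    x = y.clone()
    for _ in range(n_iters):
        x = y - alpha * adj_norm @ x @ weight
    return x

# Toy check: the reverse step recovers the input of the forward step.
n, d = 5, 4
adj_norm = torch.rand(n, n)
adj_norm /= adj_norm.sum(dim=1, keepdim=True)  # row-normalized adjacency
weight = 0.1 * torch.randn(d, d)               # small weights keep the map contractive
x = torch.randn(n, d)
y = forward_step(x, adj_norm, weight)
x_rec = reverse_step(y, adj_norm, weight)
print(torch.allclose(x, x_rec, atol=1e-5))     # True when contraction holds
```

Intuitively, the forward step smooths features toward neighborhood averages; running the inverse sharpens them again, which is why neighboring nodes with different labels (the heterophilic case) become easier to separate.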

Authors (3)
  1. MoonJeong Park
  2. Jaeseung Heo
  3. Dongwoo Kim
Citations (1)
