Mitigating Degree Biases in Message Passing Mechanism by Utilizing Community Structures (2312.16788v1)

Published 28 Dec 2023 in cs.LG, cs.AI, and cs.SI

Abstract: This study utilizes community structures to address node degree biases in message passing (MP) via learnable graph augmentations and novel graph transformers. Recent augmentation-based methods have shown that MP neural networks often perform poorly on low-degree nodes, leading to degree biases because too few messages reach low-degree nodes. Despite their success, most methods use heuristic or uniform random augmentations, which are non-differentiable and may not always generate edges that are valuable for learning representations. In this paper, we propose Community-aware Graph Transformers (CGT) to learn degree-unbiased representations based on learnable augmentations and graph transformers that exploit within-community structures. First, we design a learnable graph augmentation that generates more within-community edges connecting low-degree nodes through edge perturbation. Second, we propose an improved self-attention that learns the underlying proximity and the roles of nodes within a community. Third, we propose a self-supervised learning task that learns representations preserving the global graph structure and regularizes the graph augmentations. Extensive experiments on various benchmark datasets show that CGT outperforms state-of-the-art baselines and significantly mitigates node degree biases. The source code is available at https://github.com/NSLab-CUK/Community-aware-Graph-Transformer.
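The abstract only sketches the learnable augmentation at a high level. As a rough illustration, the PyTorch snippet below shows one way a differentiable, community-aware edge perturbation could look. The class name CommunityAwareEdgeAugmenter, the prior terms, and the binary-Concrete (Gumbel-Sigmoid) relaxation are assumptions made for illustration, not the paper's actual implementation; the linked repository contains the authors' code.

import torch
import torch.nn as nn

class CommunityAwareEdgeAugmenter(nn.Module):
    """Hypothetical sketch: scores candidate edges and samples soft edge
    weights with a binary-Concrete relaxation, biasing additions toward
    within-community edges that touch low-degree nodes."""

    def __init__(self, in_dim: int, hidden_dim: int, tau: float = 0.5):
        super().__init__()
        self.scorer = nn.Sequential(
            nn.Linear(2 * in_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )
        self.tau = tau

    def forward(self, x, cand_edges, communities, degrees):
        # x: [N, F] node features; cand_edges: [2, E] candidate edge index
        # communities: [N] community ids; degrees: [N] node degrees
        src, dst = cand_edges
        logits = self.scorer(torch.cat([x[src], x[dst]], dim=-1)).squeeze(-1)

        # Prior terms: reward within-community pairs and pairs whose
        # lower-degree endpoint has few existing neighbours.
        same_comm = (communities[src] == communities[dst]).float()
        low_deg = 1.0 / (1.0 + torch.minimum(degrees[src], degrees[dst]).float())
        logits = logits + same_comm + low_deg

        # Binary-Concrete relaxation keeps edge sampling differentiable,
        # so the augmentation can be trained end to end.
        u = torch.rand_like(logits)
        noise = torch.log(u + 1e-10) - torch.log(1.0 - u + 1e-10)
        return torch.sigmoid((logits + noise) / self.tau)  # soft weights in (0, 1)

In such a setup, the returned soft weights could be thresholded or used directly as weighted edges by the downstream transformer, and the loss on the degree-unbiased representations would backpropagate into the edge scorer.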
