
HomoGCL: Rethinking Homophily in Graph Contrastive Learning (2306.09614v1)

Published 16 Jun 2023 in cs.LG and cs.SI

Abstract: Contrastive learning (CL) has become the de facto learning paradigm in self-supervised learning on graphs, generally following the "augmenting-contrasting" learning scheme. However, we observe that, unlike CL in the computer vision domain, CL in the graph domain performs decently even without augmentation. We conduct a systematic analysis of this phenomenon and argue that homophily, i.e., the principle that "like attracts like", plays a key role in the success of graph CL. To leverage this property explicitly, we propose HomoGCL, a model-agnostic framework that expands the positive set using neighbor nodes with neighbor-specific significances. Theoretically, HomoGCL introduces a stricter lower bound on the mutual information between raw node features and node embeddings in augmented views. Furthermore, HomoGCL can be combined with existing graph CL models in a plug-and-play way with light extra computational overhead. Extensive experiments demonstrate that HomoGCL yields multiple state-of-the-art results across six public datasets and consistently brings notable performance improvements when applied to various graph CL methods. Code is available at https://github.com/wenzhilics/HomoGCL.
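To make the abstract's central idea concrete, below is a minimal sketch of an InfoNCE-style objective in which each node's positive set is expanded with its graph neighbors, each weighted by a per-neighbor significance. This is an illustration under stated assumptions, not the authors' implementation (see the linked repository for that): the `neighbor_weight` matrix, the dense adjacency, and the two-view setup are placeholders introduced only for the example.

```python
import torch
import torch.nn.functional as F

def neighbor_weighted_infonce(z1, z2, adj, neighbor_weight, tau=0.5):
    """InfoNCE-style loss where each node's positives include its anchor in the
    other view plus its graph neighbors, weighted per neighbor.

    z1, z2          : (N, d) node embeddings from two augmented views
    adj             : (N, N) dense 0/1 adjacency matrix (no self-loops)
    neighbor_weight : (N, N) hypothetical per-edge positive weights in [0, 1];
                      zero wherever adj is zero
    tau             : temperature
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)

    # Exponentiated cosine similarities between all cross-view node pairs.
    sim = torch.exp(z1 @ z2.t() / tau)                # (N, N)

    # Positive mass: the node itself in the other view plus weighted neighbors.
    pos = sim.diag() + (neighbor_weight * adj * sim).sum(dim=1)

    # Denominator: all cross-view pairs, as in standard InfoNCE normalization.
    denom = sim.sum(dim=1)

    return -torch.log(pos / denom).mean()


# Toy usage with random tensors (shapes only; real use would take GNN outputs).
N, d = 8, 16
z1, z2 = torch.randn(N, d), torch.randn(N, d)
adj = (torch.rand(N, N) > 0.7).float()
adj.fill_diagonal_(0)
w = torch.rand(N, N) * adj                            # hypothetical neighbor weights
loss = neighbor_weighted_infonce(z1, z2, adj, w)
```

In practice the neighbor weights would come from some learned or estimated soft assignment rather than random values, a sparse adjacency would replace the dense matrices on larger graphs, and the loss would typically be symmetrized over both views; those details are omitted here for brevity.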
