Heterogeneous Graph Contrastive Learning with Meta-path Contexts and Adaptively Weighted Negative Samples (2212.13847v3)
Abstract: Heterogeneous graph contrastive learning has received wide attention recently. Some existing methods use meta-paths, which are sequences of object types that capture semantic relationships between objects, to construct contrastive views. However, most of them ignore the rich meta-path context information that describes how two objects are connected by meta-paths. Further, they fail to distinguish negative samples, which could adversely affect model performance. To address these problems, we propose MEOW, which considers both meta-path contexts and weighted negative samples. Specifically, MEOW constructs a coarse view and a fine-grained view for contrast. The former reflects which objects are connected by meta-paths, while the latter uses meta-path contexts to characterize the details of how the objects are connected. We then theoretically analyze the InfoNCE loss and identify its limitations in computing gradients for negative samples. To better distinguish negative samples, we learn hard-valued weights for them based on node clustering and use prototypical contrastive learning to pull the embeddings of nodes in the same cluster closer together. In addition, we propose a variant model, AdaMEOW, which adaptively learns soft-valued weights for negative samples to further improve node representations. Finally, we conduct extensive experiments to show the superiority of MEOW and AdaMEOW over other state-of-the-art methods.
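To make the weighted-negative idea concrete, below is a minimal sketch (not the authors' released code) of an InfoNCE-style contrastive loss between two views in which every negative pair carries its own nonnegative weight; setting all weights to 1 recovers the standard InfoNCE loss. The function name `weighted_infonce`, the use of cosine similarity, and the temperature default are illustrative assumptions.

```python
# Minimal sketch of an InfoNCE loss with per-negative weights between two views.
# Not the authors' implementation; names and defaults are assumptions.
import torch
import torch.nn.functional as F

def weighted_infonce(z_coarse, z_fine, neg_weights, tau=0.5):
    """z_coarse, z_fine: (N, d) embeddings of the same nodes under two views.
    neg_weights: (N, N) nonnegative weights for negative pairs (diagonal ignored).
    The positive pair for node i is row i of the other view."""
    z1 = F.normalize(z_coarse, dim=1)
    z2 = F.normalize(z_fine, dim=1)
    sim = torch.exp(z1 @ z2.t() / tau)        # exp(cosine similarity / temperature)
    pos = sim.diag()                          # positive pairs: same node in both views
    # Mask out the positives on the diagonal and weight the remaining negatives.
    neg_mask = 1.0 - torch.eye(sim.size(0), device=sim.device)
    neg = (neg_weights * neg_mask * sim).sum(dim=1)
    loss = -torch.log(pos / (pos + neg))
    return loss.mean()
```

Under this sketch, hard-valued weights derived from node clustering (as in MEOW) or soft-valued weights learned adaptively (as in AdaMEOW) could both be supplied through `neg_weights`.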
Authors: Jianxiang Yu, Qingqing Ge, Xiang Li, Aoying Zhou