Multi-Scale Subgraph Contrastive Learning (2403.02719v3)
Abstract: Graph-level contrastive learning, which aims to learn a representation for each graph by contrasting two augmented graphs, has attracted considerable attention. Previous studies usually assume that a graph and its augmented graph form a positive pair, and otherwise a negative pair. However, it is well known that graph structure is often complex and multi-scale, which gives rise to a fundamental question: after graph augmentation, does the previous assumption still hold in reality? Through an experimental analysis, we discover that the semantic information of an augmented graph structure may not be consistent with that of the original graph, and that whether two augmented graphs form a positive or negative pair is highly related to their multi-scale structures. Based on this finding, we propose a multi-scale subgraph contrastive learning architecture that is able to characterize fine-grained semantic information. Specifically, we generate global and local views at different scales based on subgraph sampling, and construct multiple contrastive relationships according to their semantic associations to provide richer self-supervised signals. Extensive experiments and parameter analyses on eight real-world graph classification datasets demonstrate the effectiveness of the proposed method.
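The two ingredients the abstract names, subgraph sampling to build local views and a contrastive objective over the resulting view pairs, can be illustrated with a minimal sketch. This is not the paper's implementation: the random-walk sampler and the InfoNCE-style loss below are common, hypothetical choices standing in for the method's actual components.

```python
import random
import numpy as np

def sample_subgraph(adj, start, walk_len, rng):
    """Sample a local view: collect nodes visited by a short random walk.
    (Illustrative sampler; the paper's exact sampling scheme may differ.)"""
    nodes = {start}
    cur = start
    for _ in range(walk_len):
        nbrs = adj[cur]
        if not nbrs:
            break
        cur = rng.choice(nbrs)
        nodes.add(cur)
    return sorted(nodes)

def info_nce(anchor, positives, negatives, tau=0.5):
    """InfoNCE-style contrastive loss between an anchor view embedding and
    positive/negative view embeddings (a common objective, used here as a stand-in)."""
    def sim(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    pos = np.exp(np.array([sim(anchor, p) / tau for p in positives]))
    neg = np.exp(np.array([sim(anchor, n) / tau for n in negatives]))
    return float(-np.log(pos.sum() / (pos.sum() + neg.sum())))

# Toy usage: sample a local view from a triangle graph, then score embeddings.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
rng = random.Random(0)
local_view = sample_subgraph(adj, start=0, walk_len=3, rng=rng)
loss = info_nce(np.array([1.0, 0.0]),
                positives=[np.array([0.9, 0.1])],
                negatives=[np.array([0.0, 1.0])])
```

The key idea the paper adds on top of such a baseline is that which sampled views count as positives or negatives depends on their multi-scale semantic association, rather than being fixed by the augmentation alone.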