Graph Contrastive Learning Meets Graph Meta Learning: A Unified Method for Few-shot Node Tasks (2309.10376v1)
Abstract: Graph Neural Networks (GNNs) have become popular in Graph Representation Learning (GRL). One fundamental application is few-shot node classification. Most existing methods follow the meta learning paradigm and show the ability to generalize quickly to few-shot tasks. However, recent work indicates that graph contrastive learning combined with fine-tuning can significantly outperform meta learning methods. Despite this empirical success, the reasons behind it remain poorly understood. In our study, we first identify two crucial advantages of contrastive learning over meta learning: (1) the comprehensive utilization of graph nodes and (2) the power of graph augmentations. To integrate the strengths of both contrastive learning and meta learning on few-shot node classification tasks, we introduce a new paradigm: Contrastive Few-Shot Node Classification (COLA). Specifically, COLA employs graph augmentations to identify semantically similar nodes, which enables the construction of meta-tasks without the need for label information. COLA can therefore utilize all nodes to construct meta-tasks, further reducing the risk of overfitting. Through extensive experiments, we validate that each component of our design is essential and demonstrate that COLA achieves a new state of the art on all tasks.
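To make the core idea concrete, below is a minimal, hypothetical sketch in plain PyTorch of a label-free contrastive meta-task of the kind the abstract describes: two augmented views of the graph are encoded, each node's second-view embedding acts as its "support prototype", and queries from the first view are classified against all prototypes. The toy feature-dropout augmentation, one-layer GCN-style encoder, and temperature value are illustrative assumptions, not the authors' COLA implementation.

```python
import torch
import torch.nn.functional as F

def augment(x, adj, drop_feat=0.2):
    """Toy feature-dropout augmentation; edge-level augmentations are omitted here."""
    mask = (torch.rand_like(x) > drop_feat).float()
    return x * mask, adj

def encode(x, adj, weight):
    """One-layer GCN-style encoder: mean neighborhood aggregation followed by a linear map."""
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
    h = (adj @ x) / deg
    return F.relu(h @ weight)

def contrastive_meta_task_loss(x, adj, weight, tau=0.5):
    """Label-free meta-task: the second view of node i serves as the
    support prototype for query node i in the first view."""
    x1, a1 = augment(x, adj)
    x2, a2 = augment(x, adj)
    z_query = F.normalize(encode(x1, a1, weight), dim=1)   # query embeddings
    z_proto = F.normalize(encode(x2, a2, weight), dim=1)   # prototype embeddings
    logits = z_query @ z_proto.t() / tau                   # N x N similarity scores
    targets = torch.arange(x.size(0))                      # query i should match prototype i
    return F.cross_entropy(logits, targets)

# Usage on a random toy graph (all nodes participate, no labels required):
N, d, hid = 8, 16, 32
x = torch.randn(N, d)
adj = (torch.rand(N, N) > 0.7).float()
adj = torch.clamp(adj + adj.t() + torch.eye(N), max=1)     # symmetrize, add self-loops
weight = torch.randn(d, hid, requires_grad=True)
loss = contrastive_meta_task_loss(x, adj, weight)
loss.backward()
```

Because positives are defined by augmentation rather than labels, every node can contribute to a meta-task, which is the property the abstract credits with reducing overfitting.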
- Hao Liu
- Jiarui Feng
- Lecheng Kong
- Dacheng Tao
- Yixin Chen
- Muhan Zhang