Exploring Correlations of Self-Supervised Tasks for Graphs

Published 7 May 2024 in cs.LG and cs.AI | arXiv:2405.04245v2

Abstract: Graph self-supervised learning has sparked a surge of research into training informative representations without access to any labeled data. However, our understanding of graph self-supervised learning remains limited, and the inherent relationships between various self-supervised tasks are still unexplored. Our paper aims to provide a fresh understanding of graph self-supervised learning based on task correlations. Specifically, we evaluate the performance of representations trained with one specific task on other tasks and define correlation values to quantify task correlations. Through this process, we unveil the correlations between various self-supervised tasks and measure their expressive capabilities, which are closely related to downstream performance. By analyzing the correlation values between tasks across various datasets, we reveal the complexity of task correlations and the limitations of existing multi-task learning methods. To obtain more capable representations, we propose Graph Task Correlation Modeling (GraphTCM) to characterize the task correlations and use it to enhance graph self-supervised training. The experimental results indicate that our method significantly outperforms existing methods across various downstream tasks.
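The measurement the abstract describes can be sketched directly: train a representation with one self-supervised task, score it under every other task's objective, and normalize the scores into a matrix of correlation values. The Python sketch below is an illustration only, assuming hypothetical train_fns and eval_fns callables as stand-ins for the paper's self-supervised training pipelines and task heads; the column-wise normalization is likewise an assumed choice, not necessarily the paper's exact definition of the correlation value.

    import numpy as np

    def correlation_matrix(train_fns, eval_fns):
        # train_fns[i]() -> representation trained with SSL task i, e.g. an
        # (N, d) array of node embeddings; eval_fns[j](rep) -> non-negative
        # score of that representation under task j's objective.
        # Both are hypothetical stand-ins for real SSL pipelines.
        k = len(train_fns)
        scores = np.zeros((k, k))
        for i, train in enumerate(train_fns):
            rep = train()                      # representation from task i
            for j, evaluate in enumerate(eval_fns):
                scores[i, j] = evaluate(rep)   # how well it serves task j
        # Scale each column by its best score so values are comparable
        # across source tasks (an illustrative normalization choice).
        best = scores.max(axis=0, keepdims=True)
        return scores / np.clip(best, 1e-12, None)

    # Toy usage with random stand-ins for trained representations and task heads.
    rng = np.random.default_rng(0)
    reps = [rng.normal(size=(100, 16)) for _ in range(3)]
    train_fns = [lambda r=r: r for r in reps]
    heads = [rng.normal(size=16) for _ in range(3)]
    eval_fns = [lambda rep, w=w: float(np.mean((rep @ w) ** 2)) for w in heads]
    print(np.round(correlation_matrix(train_fns, eval_fns), 3))  # 3x3 correlation values

A matrix of this kind, computed across datasets, is what the paper analyzes to expose the complexity of task correlations, and it is the kind of structure GraphTCM is proposed to model in order to enhance self-supervised training.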
