Isomorphic-Consistent Variational Graph Auto-Encoders for Multi-Level Graph Representation Learning (2312.05519v1)

Published 9 Dec 2023 in cs.LG

Abstract: Graph representation learning is a fundamental research theme and can be generalized to benefit multiple downstream tasks, from the node and link levels to the higher graph level. In practice, it is desirable to develop task-agnostic, general graph representation learning methods that are typically trained in an unsupervised manner. Related research reveals that the power of graph representation learning methods depends on whether they can differentiate distinct graph structures as different embeddings and map isomorphic graphs to consistent embeddings (i.e., the isomorphic consistency of graph models). However, for task-agnostic general graph representation learning, existing unsupervised graph models, represented by variational graph auto-encoders (VGAEs), can only keep isomorphic consistency within the subgraphs of 1-hop neighborhoods and thus usually manifest inferior performance on the more difficult higher-level tasks. To overcome the limitations of existing unsupervised methods, in this paper we propose the Isomorphic-Consistent VGAE (IsoC-VGAE) for multi-level task-agnostic graph representation learning. We first devise a decoding scheme that provides a theoretical guarantee of keeping isomorphic consistency under unsupervised learning settings. We then propose the Inverse Graph Neural Network (Inv-GNN) decoder as its intuitive realization, which trains the model by reconstructing the GNN node embeddings with multi-hop neighborhood information, so as to maintain high-order isomorphic consistency within the VGAE framework. We conduct extensive experiments on representative graph learning tasks at different levels, including node classification, link prediction, and graph classification, and the results verify that our proposed model generally outperforms both state-of-the-art unsupervised methods and representative supervised methods.
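To make the decoding idea concrete, below is a minimal PyTorch sketch of a VGAE whose decoder reconstructs multi-hop GNN embeddings instead of only the 1-hop adjacency, in the spirit of the Inv-GNN decoder summarized in the abstract. This is not the authors' implementation: the dense single-layer GCN encoder, the per-hop MLP heads, the detached propagation targets, and the MSE reconstruction objective are all illustrative assumptions made for this sketch.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DenseGCNLayer(nn.Module):
    """One GCN layer over a dense, symmetrically normalized adjacency."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, a_norm, x):
        return F.relu(self.lin(a_norm @ x))

class IsoCVGAESketch(nn.Module):
    """Hypothetical sketch: VGAE with an embedding-reconstruction decoder."""
    def __init__(self, in_dim, hid_dim, lat_dim, hops=2):
        super().__init__()
        self.enc = DenseGCNLayer(in_dim, hid_dim)
        self.mu = nn.Linear(hid_dim, lat_dim)
        self.logvar = nn.Linear(hid_dim, lat_dim)
        # One decoder head per hop; head k is trained to reproduce the
        # (detached) encoder embedding after k extra propagation steps.
        # The per-hop MLP organization is an assumption, not the paper's design.
        self.heads = nn.ModuleList(
            nn.Sequential(nn.Linear(lat_dim, hid_dim), nn.ReLU(),
                          nn.Linear(hid_dim, hid_dim))
            for _ in range(hops))

    def encode(self, a_norm, x):
        h = self.enc(a_norm, x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        return z, mu, logvar, h

    def loss(self, a_norm, x):
        z, mu, logvar, h = self.encode(a_norm, x)
        recon, target = 0.0, h.detach()
        for head in self.heads:
            target = a_norm @ target  # one further neighborhood hop
            recon = recon + F.mse_loss(head(z), target)
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return recon + kl

# Toy usage on a 4-node graph with random 8-dim features.
n, d = 4, 8
adj = torch.tensor([[0., 1., 0., 0.],
                    [1., 0., 1., 1.],
                    [0., 1., 0., 1.],
                    [0., 1., 1., 0.]])
a_hat = adj + torch.eye(n)                     # add self-loops
deg = a_hat.sum(dim=1)
a_norm = a_hat / (deg.sqrt().unsqueeze(1) * deg.sqrt().unsqueeze(0))
x = torch.randn(n, d)
model = IsoCVGAESketch(in_dim=d, hid_dim=16, lat_dim=8)
print(model.loss(a_norm, x))                   # scalar training loss

Minimizing loss() trains the latent code z to carry enough multi-hop neighborhood information that simple heads can recover the propagated embeddings; the paper's actual targets, decoder, and objective may differ from this sketch.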

Authors (3)
  1. Hanxuan Yang
  2. Qingchao Kong
  3. Wenji Mao
