Graph Neural Networks can Recover the Hidden Features Solely from the Graph Structure (2301.10956v4)
Abstract: Graph Neural Networks (GNNs) are popular models for graph learning problems. GNNs show strong empirical performance in many practical tasks. However, their theoretical properties have not been fully elucidated. In this paper, we investigate whether GNNs can exploit the graph structure from the perspective of their expressive power. In our analysis, we consider graph generation processes that are controlled by hidden (or latent) node features, which contain all information about the graph structure. A typical example of this framework is a kNN graph constructed from the hidden features. In our main results, we show that GNNs can recover the hidden node features from the input graph alone, even when all node features, including the hidden features themselves and any indirect hints, are unavailable. GNNs can further use the recovered node features for downstream tasks. These results show that GNNs can fully exploit the graph structure by themselves, and in effect, GNNs can use both the hidden and explicit node features for downstream tasks. In the experiments, we confirm the validity of our results by showing that GNNs can accurately recover the hidden features using a GNN architecture built on our theoretical analysis.
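To make the setting concrete, the following sketch simulates the generation process described above: hidden 2D node features produce a kNN graph, and the features are then recovered from the unweighted graph alone via an Isomap-style procedure (hop distances plus classical MDS). This is an illustration of the problem setup, not the paper's GNN architecture; `n`, `k`, and the recovery method are assumptions chosen for the demo. Note that the hidden features can only be recovered up to rotation, reflection, and scale, so the comparison uses an orthogonal Procrustes alignment.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, d = 100, 10, 2

# Hidden node features: points sampled uniformly in the unit square.
Z = rng.random((n, d))

# Graph generation: connect each node to its k nearest neighbors (symmetrized).
dist = np.linalg.norm(Z[:, None] - Z[None, :], axis=-1)
A = np.zeros((n, n), dtype=bool)
for i in range(n):
    nbrs = np.argsort(dist[i])[1:k + 1]  # skip self at position 0
    A[i, nbrs] = True
A = A | A.T  # undirected kNN graph; only A is given to the recovery step

# Unweighted shortest-path (hop) distances via Floyd-Warshall,
# a proxy for geodesic distances in the hidden feature space.
G = np.where(A, 1.0, np.inf)
np.fill_diagonal(G, 0.0)
for m in range(n):
    G = np.minimum(G, G[:, [m]] + G[[m], :])
assert np.isfinite(G).all(), "graph should be connected for this sketch"

# Classical MDS on hop distances recovers a candidate embedding.
H = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * H @ (G ** 2) @ H
w, V = np.linalg.eigh(B)  # eigenvalues ascending; take the top d
X = V[:, -d:] * np.sqrt(np.maximum(w[-d:], 0.0))

# Orthogonal Procrustes alignment: the hidden features are identifiable
# only up to an orthogonal transformation and scale.
Zc, Xc = Z - Z.mean(0), X - X.mean(0)
U, _, Vt = np.linalg.svd(Xc.T @ Zc)
X_aligned = Xc @ (U @ Vt) * (np.linalg.norm(Zc) / np.linalg.norm(Xc))
err = np.linalg.norm(X_aligned - Zc) / np.linalg.norm(Zc)  # relative error
```

With these parameters the aligned recovery is close to the hidden features, illustrating that the unweighted graph structure alone retains the latent geometry; the paper's point is that a GNN can perform such recovery within its own message-passing computation.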