Improving Subgraph-GNNs via Edge-Level Ego-Network Encodings (2312.05905v2)
Published 10 Dec 2023 in cs.LG and cs.AI
Abstract: We present a novel edge-level ego-network encoding for learning on graphs that can boost Message Passing Graph Neural Networks (MP-GNNs), either by providing additional node and edge features or by extending message-passing formats. The proposed encoding is sufficient to distinguish Strongly Regular Graphs, a family of challenging 3-WL-equivalent graphs. We show theoretically that such an encoding is more expressive than node-based subgraph MP-GNNs. In an empirical evaluation on four benchmarks comprising 10 graph datasets, our results match or improve previous baselines on expressivity, graph classification, graph regression, and proximity tasks -- while reducing memory usage by 18.1x in certain real-world settings.
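The abstract's core idea is to attach to each edge a structural code derived from the ego-network around that edge's endpoints. The paper's exact construction is not reproduced here; the sketch below is an illustrative stand-in that extracts the induced subgraph within a fixed hop radius of both endpoints, marks the endpoints, and summarizes the result with a simple WL-style color-refinement hash (all function names and the `radius`/`iters` parameters are this sketch's assumptions, not the paper's API):

```python
# Minimal sketch of an edge-level ego-network encoding (illustrative only;
# the paper's actual encoding and hashing scheme may differ).
def edge_ego_encoding(adj, u, v, radius=1, iters=2):
    """Encode the subgraph induced by all nodes within `radius` hops of
    either endpoint of edge (u, v), with the two endpoints marked."""
    nodes = set()
    for root in (u, v):                 # BFS from each endpoint
        seen, frontier = {root}, {root}
        for _ in range(radius):
            frontier = {w for x in frontier for w in adj[x]} - seen
            seen |= frontier
        nodes |= seen
    sub = {n: adj[n] & nodes for n in nodes}   # induced ego-subgraph
    # Initial colors distinguish the root edge's endpoints from the rest.
    color = {n: "root" if n in (u, v) else "other" for n in nodes}
    for _ in range(iters):                     # WL-style color refinement
        color = {n: hash((color[n],
                          tuple(sorted(str(color[m]) for m in sub[n]))))
                 for n in nodes}
    # A sorted multiset of final colors is invariant to node ordering.
    return hash(tuple(sorted(color.values())))

def adjacency(edges):
    """Build an undirected adjacency dict from an edge list."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    return adj
```

As a sanity check, every edge of a 6-cycle sits inside an isomorphic ego-network, so all six edges receive the same code, whereas an end edge and a middle edge of a 4-node path see structurally different ego-networks and receive different codes.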