Technical report: Graph Neural Networks go Grammatical (2303.01590v4)
Abstract: This paper introduces a framework for formally establishing a connection between a fragment of an algebraic language and a Graph Neural Network (GNN). The framework leverages Context-Free Grammars (CFGs) to organize algebraic operations into generative rules that can be translated into a GNN layer model. Since a CFG derived directly from a language tends to contain redundant rules and variables, we present a grammar reduction scheme. Applying this scheme to MATLANG, we define a CFG whose expressive power matches the third-order Weisfeiler-Lehman (3-WL) test. From this 3-WL CFG, we derive a GNN model, named G$^2$N$^2$, which is provably 3-WL compliant. Through various experiments, we demonstrate that G$^2$N$^2$ is more efficient than other 3-WL GNNs across numerous downstream tasks. In particular, one experiment highlights the benefits of grammar reduction within our framework.
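For intuition, the 3-WL fragment of MATLANG combines matrix multiplication with the Hadamard (elementwise) product, and a grammar-derived layer applies these operations to learned linear combinations of edge-feature channels. The sketch below is a minimal illustration of that idea in NumPy, not the paper's exact G$^2$N$^2$ layer; the channel-mixing weights, shapes, and the `grammar_layer` name are assumptions made here for exposition.

```python
import numpy as np

def mix(E, W):
    """Linear mix of edge-feature channels: E is (n, n, c_in), W is (c_in, c_out)."""
    return np.einsum("ijc,cd->ijd", E, W)

def grammar_layer(E, W1, W2, W3, W4, W5):
    """Illustrative grammar-derived update on an edge-feature tensor E of shape (n, n, c).

    Each term mirrors one production of a (reduced) 3-WL CFG over MATLANG:
      - matrix product   M -> M . M
      - Hadamard product M -> M (.) M
      - identity rule    M -> M
    Outputs are concatenated along the channel axis. Weights W1..W5 are
    hypothetical stand-ins for the learned channel-mixing maps.
    """
    matmul_term = np.einsum("ikc,kjc->ijc", mix(E, W1), mix(E, W2))  # M . M, per channel
    hadamard_term = mix(E, W3) * mix(E, W4)                          # M (.) M
    skip_term = mix(E, W5)                                           # identity rule
    return np.concatenate([matmul_term, hadamard_term, skip_term], axis=-1)

# Toy usage: adjacency and identity as two input edge channels.
n, c = 5, 2
rng = np.random.default_rng(0)
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T                    # random undirected graph
E = np.stack([A, np.eye(n)], axis=-1)             # (n, n, 2)
Ws = [rng.standard_normal((c, 4)) for _ in range(5)]
out = grammar_layer(E, *Ws)                       # (n, n, 12)
print(out.shape)
```

The design point this sketch tries to convey is that each grammar production becomes one differentiable operation in the layer, so reducing the CFG directly shrinks the layer's operation set.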