The logic of rational graph neural networks (2310.13139v8)
Abstract: The expressivity of Graph Neural Networks (GNNs) can be described via appropriate fragments of first-order logic. Any query of the two-variable fragment of graded modal logic (GC2) interpreted over labeled graphs can be expressed using a Rectified Linear Unit (ReLU) GNN whose size does not grow with the size of the input graph [Barceló et al., 2020]. Conversely, a GNN expresses at most a GC2 query, whatever the activation function. In this article, we prove that some GC2 queries of depth $3$ cannot be expressed by GNNs with any rational activation function. This shows that not every non-polynomial activation function confers maximal expressivity on GNNs, answering an open question formulated by [Grohe, 2021]. This result also contrasts with the efficient universal approximation properties of rational feedforward neural networks established by [Boullé et al., 2020]. We also present a rational subfragment of first-order logic (RGC2), and prove that rational GNNs can express RGC2 queries uniformly over all graphs.
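As an illustrative example (ours, not taken from the article), a simple GC2 query is
$$\varphi(x) \;=\; \mathrm{Red}(x) \,\wedge\, \exists^{\geq 2} y \,\big( E(x,y) \wedge \mathrm{Blue}(y) \big),$$
which selects the red vertices having at least two blue neighbours. By the result of [Barceló et al., 2020], such a query is expressed by a ReLU GNN whose size is independent of the input graph, roughly by summing indicator features of the neighbours and applying an affine map followed by a ReLU threshold (a sketch of the standard construction, not of this article's proofs). The depth-$3$ queries discussed in the abstract nest three such counting quantifiers, and the negative result states that some of them cannot be expressed by any GNN whose activation is a rational function, i.e. a ratio of two polynomials.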
- Exponentially improving the complexity of simulating the Weisfeiler-Lehman test with graph neural networks. Advances in Neural Information Processing Systems, 35:27333–27346, 2022.
- Geometric deep learning: going beyond Euclidean data. IEEE Signal Processing Magazine, 34(4):18–42, 2017.
- The logical expressiveness of graph neural networks. In 8th International Conference on Learning Representations (ICLR 2020), 2020.
- Interaction networks for learning about objects, relations and physics. Advances in Neural Information Processing Systems, 29, 2016.
- Combinatorial optimization and reasoning with graph neural networks. arXiv preprint arXiv:2102.09544, 2021.
- Convolutional neural networks on graphs with fast localized spectral filtering. Advances in Neural Information Processing Systems, 29, 2016.
- Convolutional networks on graphs for learning molecular fingerprints. Advances in Neural Information Processing Systems, 28, 2015.
- Cognitive graph for multi-hop reading comprehension at scale. arXiv preprint arXiv:1905.05460, 2019.
- Martin Grohe. The logic of graph neural networks. In 2021 36th Annual ACM/IEEE Symposium on Logic in Computer Science (LICS), pages 1–17. IEEE, 2021.
- Martin Grohe. The descriptive complexity of graph neural networks. arXiv preprint arXiv:2303.04613, 2023.
- William L Hamilton. Graph representation learning. Synthesis Lectures on Artificial Intelligence and Machine Learning, 14(3):1–159, 2020.
- On the power of graph neural networks and the role of the activation function. arXiv preprint arXiv:2307.04661, 2023.
- Learning combinatorial optimization algorithms over graphs. Advances in Neural Information Processing Systems, 30, 2017.
- Weisfeiler and Leman go neural: Higher-order graph neural networks. In Proceedings of the AAAI Conference on Artificial Intelligence, volume 33, pages 4602–4609, 2019.
- Learning to simulate complex physics with graph networks. In International Conference on Machine Learning, pages 8459–8468. PMLR, 2020.
- The graph neural network model. IEEE Transactions on Neural Networks, 20(1):61–80, 2008.
- A deep learning approach to antibiotic discovery. Cell, 180(4):688–702, 2020.
- How powerful are graph neural networks? arXiv preprint arXiv:1810.00826, 2018.
- Modeling polypharmacy side effects with graph convolutional networks. Bioinformatics, 34(13):i457–i466, 2018.
- Graph neural networks: A review of methods and applications. AI Open, 1:57–81, 2020.