P-tensors: a General Formalism for Constructing Higher Order Message Passing Networks (2306.10767v1)

Published 19 Jun 2023 in stat.ML, cs.LG, math.ST, and stat.TH

Abstract: Several papers have recently shown that higher order graph neural networks can achieve better accuracy than their standard message passing counterparts, especially on highly structured graphs such as molecules. These models typically work by considering higher order representations of subgraphs contained within a given graph and then applying linear maps between them. We formalize these structures as permutation equivariant tensors, or P-tensors, and derive a basis for all linear maps between arbitrary order equivariant P-tensors. Experimentally, we demonstrate that this paradigm achieves state-of-the-art performance on several benchmark datasets.
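
As background for the abstract's central claim: in the simplest (first-order) case, where a representation is a vector indexed by a set of n elements, the space of permutation-equivariant linear maps is known to be two-dimensional, spanned by the identity and a sum-then-broadcast map (the Deep Sets result; Maron et al. give the higher-order generalization). P-tensors extend this counting of basis maps to linear maps between equivariant tensors of arbitrary order. The sketch below is illustrative only, not the paper's implementation; the function name and parameters are hypothetical.

```python
import numpy as np

def equivariant_linear(x, alpha, beta):
    """First-order permutation-equivariant linear map (hypothetical helper).

    The linear maps f: R^n -> R^n satisfying f(P x) = P f(x) for every
    permutation matrix P form a 2-dimensional space, spanned by the
    identity and the sum-then-broadcast map.
    """
    return alpha * x + beta * x.sum() * np.ones_like(x)

# Check equivariance under a random permutation.
rng = np.random.default_rng(0)
n = 5
x = rng.normal(size=n)
perm = rng.permutation(n)
alpha, beta = 0.7, -0.3

lhs = equivariant_linear(x[perm], alpha, beta)   # f(P x)
rhs = equivariant_linear(x, alpha, beta)[perm]   # P f(x)
assert np.allclose(lhs, rhs)
print("equivariance holds:", np.allclose(lhs, rhs))
```

The same style of check extends to higher orders: for linear maps between order-k and order-l equivariant tensors over n elements, the basis has dimension Bell(k+l) rather than 2, which is the counting the paper generalizes to P-tensors attached to subgraphs.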
