
Weisfeiler and Lehman Go Paths: Learning Topological Features via Path Complexes (2308.06838v6)

Published 13 Aug 2023 in cs.LG

Abstract: Graph Neural Networks (GNNs), despite achieving remarkable performance across diverse tasks, are theoretically bounded by the 1-Weisfeiler-Lehman test, which limits their graph expressivity. Although prior work on topological higher-order GNNs overcomes that boundary, these models often depend on assumptions about graph sub-structures. Specifically, topological GNNs leverage the prevalence of cliques, cycles, and rings to enhance the message-passing procedure. Our study presents a novel perspective by focusing on simple paths within graphs during the topological message-passing process, thus liberating the model from restrictive inductive biases. We prove that by lifting graphs to path complexes, our model can generalize the existing works on topology while inheriting several theoretical results on simplicial complexes and regular cell complexes. Without making prior assumptions about graph sub-structures, our method outperforms earlier works in other topological domains and achieves state-of-the-art results on various benchmarks.
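The central lifting step described in the abstract takes an ordinary graph and produces a path complex whose cells are the simple paths of the graph up to some maximum length. The paper's own lifting and message-passing schemes are more involved, but a minimal sketch of the enumeration idea is shown below; the function names and the convention of identifying a path with its reversal are illustrative assumptions, not the authors' implementation.

```python
def simple_paths_up_to(adj, max_len):
    """Enumerate all simple paths (no repeated vertices) with at most
    max_len edges.

    adj: dict mapping vertex -> set of neighbours (undirected graph).
    Single vertices count as length-0 paths (the 0-cells).
    """
    paths = [(v,) for v in adj]   # 0-cells: the vertices themselves
    frontier = paths
    for _ in range(max_len):
        extended = []
        for p in frontier:
            for w in adj[p[-1]]:
                if w not in p:          # keep the path simple
                    extended.append(p + (w,))
        paths.extend(extended)
        frontier = extended
    return paths

def path_complex(adj, max_len):
    """Lift a graph to a (toy) path complex: keep one representative per
    undirected path, treating a path and its reversal as the same cell."""
    cells = set()
    for p in simple_paths_up_to(adj, max_len):
        cells.add(min(p, p[::-1]))      # canonical orientation
    return sorted(cells, key=lambda c: (len(c), c))

# Toy example: a triangle (0-1-2) with a pendant vertex 3 attached to 2.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
cells = path_complex(adj, max_len=2)
```

On this toy graph the complex contains the 4 vertices, the 4 edges, and 5 distinct two-edge simple paths; message passing then runs between cells related by boundary and co-boundary adjacencies, analogously to simplicial and cellular message passing.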
