Expressive Higher-Order Link Prediction through Hypergraph Symmetry Breaking (2402.11339v2)

Published 17 Feb 2024 in cs.LG and stat.ML

Abstract: A hypergraph consists of a set of nodes together with a collection of subsets of the nodes called hyperedges. Higher-order link prediction is the task of predicting the existence of a missing hyperedge in a hypergraph. A hyperedge representation learned for higher-order link prediction is fully expressive when it does not lose distinguishing power up to isomorphism. Many existing hypergraph representation learners are bounded in expressive power by the Generalized Weisfeiler-Lehman-1 (GWL-1) algorithm, a generalization of the Weisfeiler-Lehman-1 algorithm. However, GWL-1 has limited expressive power: induced subhypergraphs whose nodes have identical GWL-1 values are indistinguishable. Furthermore, message passing on hypergraphs can already be computationally expensive, especially in GPU memory. To address these limitations, we devise a preprocessing algorithm that identifies certain regular subhypergraphs exhibiting symmetry. The preprocessing algorithm runs once, with complexity linear in the size of the input hypergraph. During training, we randomly replace the subhypergraphs identified by the algorithm with covering hyperedges to break symmetry. We show that our method improves the expressivity of GWL-1. Extensive experiments also demonstrate the effectiveness of our approach for higher-order link prediction on both graph and hypergraph datasets, with negligible change in computation.
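The pipeline described in the abstract (detect symmetric substructures with a GWL-1-style color refinement, then randomly collapse them into covering hyperedges during training) can be sketched as a toy illustration. This is a hedged simplification, not the paper's actual algorithm: the function names `refine_colors` and `break_symmetry`, the fixed number of refinement rounds, and the per-color-class replacement rule are assumptions made for illustration only.

```python
import random

# Toy sketch of the symmetry-breaking idea (assumed simplification,
# not the paper's algorithm). A hypergraph is a node list plus a list
# of hyperedges, each a frozenset of node ids.

def refine_colors(nodes, hyperedges, rounds=2):
    """GWL-1-style color refinement: a node's new color is derived from
    its old color and the multiset of (size, member-colors) signatures
    of the hyperedges containing it."""
    color = {v: 0 for v in nodes}
    for _ in range(rounds):
        new = {}
        for v in nodes:
            sig = sorted(
                (len(e), tuple(sorted(color[u] for u in e)))
                for e in hyperedges if v in e
            )
            new[v] = hash((color[v], tuple(sig)))
        color = new
    return color

def break_symmetry(nodes, hyperedges, p=0.5, rng=random):
    """With probability p, replace the hyperedges lying entirely inside
    a same-color node class with one covering hyperedge over that class."""
    color = refine_colors(nodes, hyperedges)
    classes = {}
    for v, c in color.items():
        classes.setdefault(c, set()).add(v)
    out = list(hyperedges)
    for cls in classes.values():
        if len(cls) < 2:
            continue
        inside = [e for e in out if e <= cls]
        if len(inside) >= 2 and rng.random() < p:
            out = [e for e in out if not e <= cls]
            out.append(frozenset(cls))  # covering hyperedge
    return out
```

On a 4-cycle viewed as a hypergraph (hyperedges {0,1}, {1,2}, {2,3}, {3,0}), refinement assigns every node the same color, so with p = 1 the four edges are replaced by the single covering hyperedge {0, 1, 2, 3}, which is the kind of symmetric, GWL-1-indistinguishable substructure the paper targets.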

