
Learning over Families of Sets -- Hypergraph Representation Learning for Higher Order Tasks (2101.07773v1)

Published 19 Jan 2021 in cs.LG and stat.ML

Abstract: Graph representation learning has made major strides over the past decade. However, in many relational domains, the input data are not suited for simple graph representations, as the relationships between entities go beyond pairwise interactions. In such cases, the relationships in the data are better represented as hyperedges (sets of entities) of a non-uniform hypergraph. While there have been works on principled methods for learning representations of the nodes of a hypergraph, these approaches are limited in their applicability to tasks on non-uniform hypergraphs (hyperedges with different cardinalities). In this work, we exploit the incidence structure to develop a hypergraph neural network that learns provably expressive representations of variable-sized hyperedges which preserve local isomorphism in the line graph of the hypergraph, while also being invariant to permutations of their constituent vertices. Specifically, for a given vertex set, we propose frameworks for (1) hyperedge classification and (2) variable-sized expansion of partially observed hyperedges, which capture the higher-order interactions among vertices and hyperedges. We evaluate performance on multiple real-world hypergraph datasets and demonstrate consistent, significant improvements in accuracy over state-of-the-art models.

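To make the core idea concrete, the following is a minimal sketch (not the authors' code) of a permutation-invariant encoder for variable-sized hyperedges that passes messages over the incidence structure: vertex features are pooled into hyperedge representations and scattered back to member vertices. The module name, layer sizes, and mean pooling are illustrative assumptions, not the paper's exact architecture.

```python
# A minimal sketch of incidence-based message passing on a non-uniform hypergraph.
# Assumes the hypergraph is given as a list of vertex-index lists (one per hyperedge).
import torch
import torch.nn as nn


class HyperedgeEncoder(nn.Module):
    """One round of vertex -> hyperedge -> vertex message passing.

    Mean pooling over a hyperedge's member vertices makes the hyperedge
    representation invariant to vertex ordering and well-defined for any cardinality.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.vert_to_edge = nn.Linear(dim, dim)  # transform vertices before pooling into hyperedges
        self.edge_to_vert = nn.Linear(dim, dim)  # transform hyperedges before scattering back

    def forward(self, x: torch.Tensor, hyperedges: list):
        # x: (num_vertices, dim) vertex features
        # hyperedges: list of vertex-index lists, one per hyperedge (any cardinality)
        edge_reprs = []
        for members in hyperedges:
            h = self.vert_to_edge(x[members])      # (|e|, dim)
            edge_reprs.append(h.mean(dim=0))       # permutation-invariant pooling
        e = torch.stack(edge_reprs)                # (num_hyperedges, dim)

        # Scatter hyperedge messages back to their member vertices, averaging over incident edges.
        out = torch.zeros_like(x)
        counts = torch.zeros(x.size(0), 1)
        msg = self.edge_to_vert(e)
        for j, members in enumerate(hyperedges):
            out[members] += msg[j]
            counts[members] += 1
        out = out / counts.clamp(min=1)
        return torch.relu(out), torch.relu(e)      # updated vertex and hyperedge representations


if __name__ == "__main__":
    # Toy usage: 5 vertices, two hyperedges of different cardinality.
    x = torch.randn(5, 16)
    hyperedges = [[0, 1, 2], [2, 3, 4, 0]]
    enc = HyperedgeEncoder(16)
    x_new, e_new = enc(x, hyperedges)
    print(x_new.shape, e_new.shape)  # torch.Size([5, 16]) torch.Size([2, 16])
```

The hyperedge representations produced this way could feed a classifier for task (1), hyperedge classification, or a scoring head for task (2), expanding a partially observed hyperedge with candidate vertices.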
Authors (3)
  1. Balasubramaniam Srinivasan (12 papers)
  2. Da Zheng (50 papers)
  3. George Karypis (110 papers)
Citations (19)
