
Higher-order Graph Neural Networks

Updated 23 November 2025
  • Higher-order Graph Neural Networks (HOGNNs) are models that encode polyadic interactions beyond simple pairwise node relationships.
  • They leverage combinatorial structures such as hyperedges, motifs, and subgraphs to enhance expressivity and capture intricate network dynamics.
  • HOGNNs improve practical tasks like molecular property prediction and social network analysis by modeling higher-order relational data.

Higher-order Graph Neural Networks (HOGNNs) comprise a broad class of architectures that explicitly encode and leverage polyadic interactions—relations among more than two nodes—into the neural computation on graphs. These models move beyond simple pairwise edges, accessing and propagating information over larger-scale combinatorial structures such as hyperedges, node tuples, induced subgraphs, motifs, simplices, and cells. The explicit modeling of higher-order relationships fundamentally increases both the expressiveness and the structural inductive bias of GNNs, often aligning them with—and in advanced cases surpassing—the k-dimensional Weisfeiler-Leman (k-WL) graph isomorphism hierarchy (Besta et al., 2024).

1. Mathematical Foundations and Higher-Order Message Passing

A HOGNN is fundamentally characterized by its use of higher-order combinatorial objects and its associated message-passing paradigm. For a standard graph $G=(V,E)$ with adjacency matrix $A \in \{0,1\}^{n \times n}$, higher-order constructs take the form of adjacency tensors $A^{(k)} \in \{0,1\}^{n \times \cdots \times n}$ (with $k$ modes), whose entries $A^{(k)}_{i_1,\ldots,i_k}$ indicate polyadic relations, such as membership in a hyperedge or a simplex. The most general HOGNN update at layer $\ell+1$ is:

$$h_v^{(\ell+1)} = \phi\bigl(\bigl\{\, \psi\bigl(h_{u_1}^{(\ell)},\dots,h_{u_k}^{(\ell)},\, A^{(k)}_{u_1\cdots u_k v}\bigr) \,\bigm|\, (u_1,\dots,u_k)\in\mathcal{N}^{(k)}(v) \bigr\}\bigr)$$

where $\mathcal{N}^{(k)}(v)$ collects all $k$-tuples relevant to $v$ (e.g., all $k$-simplices containing $v$), and $\psi$ and $\phi$ are user-specified message and aggregation functions. This framework subsumes models built on hypergraphs, node tuples, motifs, subgraphs, and simplicial or cellular complexes (Besta et al., 2024).
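To make the update concrete, the following pure-Python sketch instantiates it for 3-way relations (triangles treated as size-3 hyperedges), with a sum-based $\psi$ and a residual sum for $\phi$. All names are illustrative, not taken from any cited implementation.

```python
def higher_order_update(features, ktuples, psi, phi):
    """One generic HOGNN layer: for each node v, aggregate messages
    psi(...) over every k-tuple that forms a higher-order relation with v."""
    new_features = {}
    for v in features:
        # N^(k)(v): all k-tuples (u1, ..., uk) appearing in a relation with v
        messages = [psi([features[u] for u in tup])
                    for tup in ktuples.get(v, [])]
        new_features[v] = phi(features[v], messages)
    return new_features

# Toy example: triangles as 3-way relations on a 4-node graph.
# For each node v, list the 2-tuples (u1, u2) that close a triangle with v.
triangles = [(0, 1, 2), (1, 2, 3)]
ktuples = {}
for tri in triangles:
    for v in tri:
        ktuples.setdefault(v, []).append(tuple(u for u in tri if u != v))

features = {0: 1.0, 1: 2.0, 2: 3.0, 3: 4.0}
psi = lambda hs: sum(hs)             # message computed from the tuple
phi = lambda h, msgs: h + sum(msgs)  # sum aggregation with a residual term

updated = higher_order_update(features, ktuples, psi, phi)
print(updated)  # {0: 6.0, 1: 13.0, 2: 12.0, 3: 9.0}
```

Replacing the sum in `psi`/`phi` with learned neural functions recovers the trainable form of the update above.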

Specializations include:

  • Hypergraph GNNs: Messages between nodes and hyperedges (incidence-based propagation).
  • Simplicial/Cell Complex GNNs: Hierarchical message passing across lower/upper adjacencies, boundary/coboundary relations.
  • Tuple or subgraph-based HOGNNs: Explicit message passing over node $k$-tuples or induced subgraphs, as in $k$-GNNs (Morris et al., 2018), Subgraph GNNs, or recursive pooling architectures (Tahmasebi et al., 2020).
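As a minimal illustration of the first pattern, incidence-based propagation can be sketched in pure Python, assuming mean pooling in both directions (a common but not universal choice; names are illustrative):

```python
def hypergraph_propagate(features, hyperedges):
    """Two-phase incidence propagation: nodes -> hyperedges -> nodes,
    with mean pooling in each direction."""
    # Phase 1: each hyperedge aggregates its member nodes.
    edge_feats = [sum(features[v] for v in e) / len(e) for e in hyperedges]
    # Phase 2: each node aggregates the hyperedges it belongs to.
    out = {}
    for v in features:
        incident = [edge_feats[i] for i, e in enumerate(hyperedges) if v in e]
        out[v] = sum(incident) / len(incident) if incident else features[v]
    return out

features = {0: 0.0, 1: 2.0, 2: 4.0, 3: 6.0}
hyperedges = [{0, 1, 2}, {2, 3}]  # one 3-way and one 2-way relation
out = hypergraph_propagate(features, hyperedges)
print(out)  # {0: 2.0, 1: 2.0, 2: 3.5, 3: 5.0}
```

Note how node 2, which sits in both hyperedges, mixes information from both groups in a single round, something pairwise edges would need multiple hops to achieve.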

2. Core Taxonomy of HOGNN Models

Recent comprehensive analyses (Besta et al., 2024) distinguish HOGNN families by two axes:

  • Higher-Order Graph Data Model (HOGDM): The combinatorial objects governing relations—hypergraphs (arbitrary subsets), simplicial complexes (down-closed collections), cell complexes (hierarchical non-simplex cells), node-tuple collections, subgraph collections, motifs, and nested structures.
  • Message Passing Wiring: The set of communication channels (adjacency structures): node–hyperedge incidence (IMP), boundary/co-boundary (BAMP), down-neighboring for $k$-tuples (DAMP), and multi-hop or motif-based adjacency.

This taxonomy clarifies the distinctions among classical hypergraph GNNs (Feng et al., 2018), motif/structural encodings (Lee et al., 2018, Duval et al., 2022), k-GNNs (Morris et al., 2018), subgraph-based approaches, topological (simplicial/cellular) GNNs (Giusti, 2024), path-based aggregation (Flam-Shepherd et al., 2020), and mixed paradigms such as hybrid low/high-order convolutions (Lei et al., 2019). The motif-convolution approach, for example, defines higher-order propagation via parameterized motif-matrices, allowing for node-specific selection and attention among motifs and diffusion scales (Lee et al., 2018, Duval et al., 2022).
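The motif-matrix idea can be illustrated in its simplest form: weight each edge by the number of triangles it participates in, and let the model propagate over this matrix instead of (or alongside) the raw adjacency. The sketch below is an illustrative toy, not the parameterized, attention-weighted construction of the cited works.

```python
def triangle_motif_adjacency(n, edges):
    """M[i][j] = number of triangles containing edge (i, j).
    A motif-GNN diffuses over M instead of the raw adjacency A."""
    adj = [[0] * n for _ in range(n)]
    for i, j in edges:
        adj[i][j] = adj[j][i] = 1
    M = [[0] * n for _ in range(n)]
    for i, j in edges:
        # every common neighbour of i and j closes a triangle over (i, j)
        common = sum(adj[i][k] and adj[j][k] for k in range(n))
        M[i][j] = M[j][i] = common
    return M

# Two triangles sharing the edge (1, 2).
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
M = triangle_motif_adjacency(4, edges)
print(M[1][2])  # 2: the shared edge lies in two triangles
```

Edges that participate in no triangle receive weight zero, so diffusion over M concentrates on motif-dense regions of the graph.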

3. Representative HOGNN Architectures and Their Expressivity

Several canonical HOGNN architectures exemplify the spectrum of modeling higher-order relations:

| Model Type | Combinatorial Domain | Expressivity / Theoretical Bound |
|---|---|---|
| Hypergraph GNN | Hyperedges (arbitrary) | Generalizes GCN; can model group-wise relations (Feng et al., 2018) |
| Simplicial/Cell GNN | Simplicial/cell complexes | Capable of Hodge homology/CWL-level reasoning (Giusti, 2024) |
| $k$-GNN (tuple-based) | $k$-node tuples | Matches $k$-WL hierarchy (Morris et al., 2018) |
| Path-GNN/Path-MPNN | Simple paths (length $k$) | Exceeds 1-WL for $k>1$; includes geometric invariants (Flam-Shepherd et al., 2020) |
| Motif-GNN | Chosen motifs (triangles, ...) | Diffusion via motif structure (Lee et al., 2018) |
| Derivative GNN (HOD-GNN) | Node feature derivatives | Expressivity matches subgraph GNNs up to $k$-WL (Eitan et al., 2 Oct 2025) |

The $k$-GNN (Morris et al., 2018) uses node $k$-tuple features, with message passing that mimics $k$-WL color refinement; it achieves expressivity equal to $k$-WL, strictly subsuming standard GNNs. The High-Order Derivative GNN (HOD-GNN) notably achieves the full expressivity of $k$-order subgraph aggregation GNNs by leveraging high-order partial derivatives of a base MPNN with respect to node features, integrating Taylor expansion theory for marked node identification (Eitan et al., 2 Oct 2025).
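The derivative idea can be grounded in a toy case: for a purely linear message-passing network $h^{(t)} = A h^{(t-1)}$, the derivative of node $v$'s output with respect to node $u$'s input equals $(A^t)_{vu}$, a walk count, i.e. structural information that the base MPNN's output alone need not expose. The sketch below checks this by finite differences; it is an illustration of the principle, not the HOD-GNN implementation.

```python
def linear_mpnn(adj, x, layers=2):
    """A linear MPNN: h_v <- sum of neighbours' features, repeated."""
    h = dict(x)
    for _ in range(layers):
        h = {v: sum(h[u] for u in adj[v]) for v in adj}
    return h

def derivative(adj, x, v, u, layers=2, eps=1e-6):
    """Finite-difference estimate of d h_v / d x_u after `layers` rounds."""
    xp = dict(x)
    xp[u] += eps
    return (linear_mpnn(adj, xp, layers)[v] - linear_mpnn(adj, x, layers)[v]) / eps

# Triangle graph: (A^2)_{vu} counts 2-step walks from u to v.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
x = {0: 1.0, 1: 0.0, 2: 0.0}
print(round(derivative(adj, x, 0, 0)))  # 2 two-step walks from 0 back to 0
```

For nonlinear MPNNs the derivatives are no longer plain walk counts, which is precisely what makes them informative features in their own right.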

Recursive pooling approaches, such as RNP-GNN (Tahmasebi et al., 2020), can count subgraphs of size $k$ and interpolate between simple message passing and full $k$-WL, trading complexity for targeted expressive power.
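The expressivity gap that motivates these architectures can be seen concretely: 1-WL color refinement cannot distinguish two disjoint triangles from a single 6-cycle (both are 2-regular on six nodes), while any model that counts size-3 subgraphs separates them immediately. A self-contained demonstration (illustrative code, not from the cited papers):

```python
from collections import Counter

def wl_refine(adj, rounds=3):
    """1-WL colour refinement; returns the final colour histogram."""
    colors = {v: 0 for v in adj}
    for _ in range(rounds):
        # signature = own colour + multiset of neighbour colours
        sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in adj}
        relabel = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        colors = {v: relabel[sigs[v]] for v in adj}
    return Counter(colors.values())

def triangle_count(adj):
    return sum(1 for u in adj for v in adj[u] for w in adj[v]
               if w in adj[u] and u < v < w)

two_triangles = {0: {1, 2}, 1: {0, 2}, 2: {0, 1},
                 3: {4, 5}, 4: {3, 5}, 5: {3, 4}}
six_cycle = {i: {(i - 1) % 6, (i + 1) % 6} for i in range(6)}

print(wl_refine(two_triangles) == wl_refine(six_cycle))        # True: 1-WL fails
print(triangle_count(two_triangles), triangle_count(six_cycle))  # 2 0
```

Since every node in both graphs has degree 2, refinement never splits the colour classes, yet the triangle counts differ; a subgraph- or tuple-based HOGNN has direct access to exactly this distinction.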

4. Algorithmic Frameworks and Implementations

Practical instantiations of HOGNNs face significant computational and memory challenges due to the exponential growth in the number of higher-order objects ($O(n^k)$ for $k$-tuples, $O(n^{d+1})$ for $d$-simplices). To address this:

  • Sparse and recursive pooling architectures exploit graph sparsity to reduce computational overhead (Eitan et al., 2 Oct 2025, Tahmasebi et al., 2020).
  • Tools such as PyTorch Geometric High Order (PyGHO) (Wang et al., 2023) provide unified abstractions for high-order data structures (Masked/Sparse Tensor), message passing across arbitrary dimensions, and high-order batching, enabling fast prototyping and minimizing boilerplate for $k$-tuple and subgraph-based architectures.
  • Specialized operators and encoders (e.g., IGN for derivative tensors) support expressivity-aligned design.
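The sparsity argument above can be made concrete: rather than materializing all $O(n^k)$ tuples, one can enumerate only the tuples whose induced subgraph is connected, by expanding partial tuples along edges. An illustrative pure-Python sketch (names are not from any cited library):

```python
def connected_ktuples(adj, k):
    """Enumerate node k-tuples whose induced subgraph is connected,
    by expanding partial tuples along graph edges (exploits sparsity)."""
    results = set()

    def expand(tup):
        if len(tup) == k:
            results.add(tuple(sorted(tup)))
            return
        # only nodes adjacent to the partial tuple can keep it connected
        frontier = set().union(*(adj[v] for v in tup)) - set(tup)
        for u in frontier:
            expand(tup + (u,))

    for v in adj:
        expand((v,))
    return results

# Sparse 100-node path graph: connected 3-tuples grow linearly in n,
# while all 3-subsets would number C(100, 3) = 161700.
n = 100
path = {i: {j for j in (i - 1, i + 1) if 0 <= j < n} for i in range(n)}
tuples3 = connected_ktuples(path, 3)
print(len(tuples3))  # 98 connected triples
```

On sparse graphs this reduction is dramatic; sparsity-aware HOGNN implementations rely on exactly this kind of restriction of the tuple space.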

5. Empirical Performance, Use Cases, and Limitations

Empirical studies report that HOGNNs yield consistent and often substantial gains over classical GNNs in tasks where higher-order structure is significant. Examples include:

  • Molecular property prediction: Explicit geometric and substructure awareness yields state-of-the-art results in chemical datasets, outperforming both GCN and MPNN baselines (Flam-Shepherd et al., 2020, KC et al., 2020, Eitan et al., 2 Oct 2025).
  • Social, collaboration, and biological networks: Hypergraph and simplicial models capture complex group interactions and community structure, leading to higher clustering fidelity and improved node classification accuracy (Feng et al., 2018, Duval et al., 2022).
  • Motif-based and topological architectures improve robustness, clustering quality, and classification performance, especially on tasks sensitive to network motifs, cycles, or higher homology (Lee et al., 2018, Buffelli et al., 2024).

Trade-offs center on computational cost versus expressivity. Complete $k$-GNN or subgraph GNN architectures are typically feasible only for small $k$ and modestly sized graphs, and become impractical otherwise, motivating sampling-based, recursive, or sparsity-aware reductions.

6. Theoretical Insights and Recent Directions

Contemporary research has clarified key theoretical properties of HOGNN families:

  • The alignment of $k$-GNNs, Subgraph GNNs, and HOD-GNNs with the $k$-WL hierarchy defines a sequence of strictly increasing expressivity classes (Eitan et al., 2 Oct 2025, Morris et al., 2018).
  • Invariant Graphon Networks extend these constructions to graph limits, showing that expressivity and transferability hold in dense graphon regimes, with explicit universal approximation results (Herbst et al., 18 Mar 2025).
  • Topological neural models (simplicial/cellular) demonstrate improved information flow and mitigate over-squashing by providing higher-dimensional “shortcuts” in computational graphs (Giusti, 2024).
  • Path-, motif-, and multi-scale architectures enable flexible modeling of both local and global higher-order dependencies (Lee et al., 2018).

Open challenges concern flexible, data-driven construction of HOGDMs, scalable and memory-efficient computation, extension to dynamic and heterogeneous domains, and systematic benchmarking and comparison (Besta et al., 2024).

7. Summary Table of Major HOGNN Families

| Family | Principal Object | Message Passing Domain | Expressivity (relative to WL) | Computational Cost |
|---|---|---|---|---|
| Hypergraph GNN | Hyperedges | Node–hyperedge incidence | Exceeds 1-WL in group interaction | Linear in #nodes + #edges |
| $k$-GNN | Node $k$-tuples | $k$-tuple adjacency | Matches $k$-WL | $O(n^k)$ per layer |
| Subgraph GNN | Induced subgraphs | Subgraph message passing | Matches $k$-WL (for subgraphs of size $k$) | $O(n^k)$ |
| Simplicial GNN | Simplices | Boundary/coboundary | Capable of Hodge/CWL inference | Linear in #simplices |
| Motif-GNN | Motif instances | Motif-based adjacency | Surpasses 1-WL | Motif enumeration + GCN |
| HOD-GNN | Node feature derivatives | Node + derivative encoding | Matches k-OSAN/Subgraph GNN ($k$-WL) | Sparse: $O(d_{\max}\, n\, s_{t-1})$ |
| Path-GNN | Simple paths | $k$-path aggregation | Exceeds 1-WL; context- and geometry-aware | Path enumeration (pruning for large $k$) |
| Topological NN | Simplicial/cell complexes | Multi-dimensional MP | Beyond 1-WL; homology/topology-aware | Depends on complex size |

In conclusion, Higher-Order Graph Neural Networks provide a rich and theoretically grounded toolkit for modeling, reasoning, and learning over graph-structured data enriched with polyadic, combinatorial, and topological structure, with architectures and theoretical underpinnings that both subsume and extend the classic GNN paradigm (Besta et al., 2024, Morris et al., 2018, Giusti, 2024, Eitan et al., 2 Oct 2025, Herbst et al., 18 Mar 2025).
