Higher-order Graph Neural Networks
- Higher-order Graph Neural Networks (HOGNNs) are models that encode polyadic interactions beyond simple pairwise node relationships.
- They leverage combinatorial structures such as hyperedges, motifs, and subgraphs to enhance expressivity and capture intricate network dynamics.
- HOGNNs improve practical tasks like molecular property prediction and social network analysis by modeling higher-order relational data.
Higher-order Graph Neural Networks (HOGNNs) comprise a broad class of architectures that explicitly encode and leverage polyadic interactions—relations among more than two nodes—into the neural computation on graphs. These models move beyond simple pairwise edges, accessing and propagating information over larger-scale combinatorial structures such as hyperedges, node tuples, induced subgraphs, motifs, simplices, and cells. The explicit modeling of higher-order relationships fundamentally increases both the expressiveness and the structural inductive bias of GNNs, often aligning them with—and in advanced cases surpassing—the k-dimensional Weisfeiler-Leman (k-WL) graph isomorphism hierarchy (Besta et al., 2024).
1. Mathematical Foundations and Higher-Order Message Passing
A HOGNN is fundamentally characterized by its use of higher-order combinatorial objects and its associated message-passing paradigm. For a standard graph $G=(V,E)$ with adjacency matrix $A \in \{0,1\}^{|V|\times|V|}$, higher-order constructs take the form of adjacency tensors $\mathcal{A} \in \{0,1\}^{|V|^k}$ (with $k$ modes), whose entries indicate polyadic relations, such as membership in a hyperedge or a simplex. The most general HOGNN update at layer $\ell$ is

$$
h_v^{(\ell+1)} = \phi\!\left( h_v^{(\ell)},\; \bigoplus_{\tau \in \mathcal{N}_k(v)} \psi\!\left( h_v^{(\ell)}, h_\tau^{(\ell)} \right) \right),
$$

where $\mathcal{N}_k(v)$ collects all $k$-tuples relevant to $v$ (e.g., all $k$-simplices containing $v$), $\psi$ and $\bigoplus$ are user-specified message and aggregation functions, and $\phi$ is the update function. This framework subsumes models built on hypergraphs, node tuples, motifs, subgraphs, and simplicial or cellular complexes (Besta et al., 2024).
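To make the template concrete, the sketch below implements one such update with sum aggregation over a dictionary-based tuple store. This is a minimal illustration, not code from any cited library; the names `hognn_layer`, `psi`, and `phi` are ours.

```python
# Minimal sketch of the generic update above: features are indexed by
# k-tuples of node ids, and each tuple aggregates messages from its
# user-defined neighborhood N(tau). All names here are illustrative.
import torch

def hognn_layer(tuple_feats, neighborhoods, psi, phi):
    """One higher-order message-passing step with sum aggregation.

    tuple_feats:   dict mapping k-tuple -> feature tensor h_tau
    neighborhoods: dict mapping k-tuple -> list of related k-tuples
    psi, phi:      message and update functions
    """
    new_feats = {}
    for tau, h_tau in tuple_feats.items():
        msgs = [psi(h_tau, tuple_feats[s]) for s in neighborhoods.get(tau, [])]
        agg = torch.stack(msgs).sum(dim=0) if msgs else torch.zeros_like(h_tau)
        new_feats[tau] = phi(h_tau, agg)
    return new_feats

# Toy usage: 2-tuples (edges) of a triangle, each edge seeing the other two.
d = 4
feats = {(0, 1): torch.randn(d), (1, 2): torch.randn(d), (0, 2): torch.randn(d)}
nbrs = {t: [s for s in feats if s != t] for t in feats}
out = hognn_layer(feats, nbrs, psi=lambda h, m: torch.relu(m),
                  phi=lambda h, agg: h + agg)
```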
Specializations include:
- Hypergraph GNNs: Messages between nodes and hyperedges (incidence-based propagation; a minimal sketch follows this list).
- Simplicial/Cell Complex GNNs: Hierarchical message passing across lower/upper adjacencies, boundary/coboundary relations.
- Tuple or subgraph-based HOGNNs: Explicit message-passing over node k-tuples or induced subgraphs, as in k-GNNs (Morris et al., 2018), Subgraph GNNs, or recursive pooling architectures (Tahmasebi et al., 2020).
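In the spirit of the normalized hypergraph convolution of Feng et al. (2018), the dense sketch below propagates node features through the node-hyperedge incidence structure. The function and variable names are ours, and this is a simplified single-layer reading of that family of models.

```python
# Dense sketch of incidence-based hypergraph convolution, following the
# symmetric normalization popularized by Feng et al. (2018); names ours.
import torch

def hypergraph_conv(X, H, w_e, Theta):
    """X: (n, d) node features; H: (n, m) node-hyperedge incidence;
    w_e: (m,) hyperedge weights; Theta: (d, d_out) learnable weights."""
    d_v = H @ w_e                                   # weighted node degrees
    d_e = H.sum(dim=0)                              # hyperedge degrees
    Dv = torch.diag(d_v.clamp(min=1e-12).pow(-0.5))
    De = torch.diag(d_e.clamp(min=1e-12).reciprocal())
    W = torch.diag(w_e)
    # node -> hyperedge -> node propagation with symmetric normalization
    A = Dv @ H @ W @ De @ H.T @ Dv
    return torch.relu(A @ X @ Theta)
```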
2. Core Taxonomy of HOGNN Models
Recent comprehensive analyses (Besta et al., 2024) distinguish HOGNN families by two axes:
- Higher-Order Graph Data Model (HOGDM): The combinatorial objects governing relations—hypergraphs (arbitrary subsets), simplicial complexes (down-closed collections), cell complexes (hierarchical non-simplex cells), node-tuple collections, subgraph collections, motifs, and nested structures.
- Message Passing Wiring: The set of communication channels (adjacency structures): node–hyperedge incidence (IMP), boundary/co-boundary (BAMP), down-neighboring adjacency for k-tuples (DAMP), multi-hop or motif-based adjacency.
This taxonomy clarifies the distinctions among classical hypergraph GNNs (Feng et al., 2018), motif/structural encodings (Lee et al., 2018, Duval et al., 2022), k-GNNs (Morris et al., 2018), subgraph-based approaches, topological (simplicial/cellular) GNNs (Giusti, 2024), path-based aggregation (Flam-Shepherd et al., 2020), and mixed paradigms such as hybrid low/high-order convolutions (Lei et al., 2019). The motif-convolution approach, for example, defines higher-order propagation via parameterized motif-matrices, allowing for node-specific selection and attention among motifs and diffusion scales (Lee et al., 2018, Duval et al., 2022).
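As one concrete instance of a motif matrix, the triangle-motif adjacency below counts, for each existing edge, the triangles it participates in; diffusing features over this matrix instead of the plain adjacency is a common construction in motif-based convolutions. The exact parameterization and attention mechanisms vary across the cited papers; this sketch and its names are illustrative only.

```python
import torch

def triangle_motif_adjacency(A):
    """M[i, j] = number of triangles through edge (i, j); zero for
    non-edges. (A @ A)[i, j] counts common neighbors of i and j."""
    return (A @ A) * A

def motif_propagate(X, A, Theta):
    """One illustrative motif-based propagation step (not a specific
    published layer): row-normalize M and diffuse features through it."""
    M = triangle_motif_adjacency(A)
    deg = M.sum(dim=1, keepdim=True).clamp(min=1.0)
    return torch.relu((M / deg) @ X @ Theta)
```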
3. Representative HOGNN Architectures and Their Expressivity
Several canonical HOGNN architectures exemplify the spectrum of modeling higher-order relations:
| Model Type | Combinatorial Domain | Expressivity / Theoretical Bound |
|---|---|---|
| Hypergraph GNN | Hyperedges (arbitrary) | Generalizes GCN; can model group-wise relations (Feng et al., 2018) |
| Simplicial/Cell GNN | Simplicial/cell complexes | Capable of Hodge homology/CWL-level reasoning (Giusti, 2024) |
| k-GNN (tuple-based) | k-node tuples | Matches k-WL hierarchy (Morris et al., 2018) |
| Path-GNN/Path-MPNN | Simple paths (length-k) | Exceeds 1-WL for sufficiently long paths; includes geometric invariants (Flam-Shepherd et al., 2020) |
| Motif-GNN | Chosen motifs (triangles, ...) | Diffusion via motif structure (Lee et al., 2018) |
| Derivative GNN (HOD-GNN) | Node feature derivatives | Expressivity matches subgraph GNNs (k-WL) (Eitan et al., 2 Oct 2025) |
The k-GNN (Morris et al., 2018) uses node-k-tuple features, with message passing that mimics k-WL color refinement; it achieves expressivity equal to k-WL, strictly subsuming standard GNNs. The High-Order Derivative GNN (HOD-GNN) notably achieves the full expressivity of k-order subgraph aggregation GNNs by leveraging high-order partial derivatives of a base MPNN with respect to node features, using Taylor-expansion arguments for marked-node identification (Eitan et al., 2 Oct 2025).
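For intuition, the compact sketch below runs (oblivious) k-WL color refinement, the procedure whose distinguishing power tuple-based models are measured against. Python's built-in `hash` stands in for an injective recoloring (a real implementation would use collision-free relabeling), and the function name is ours.

```python
from itertools import product

def k_wl_colors(adj, k=2, rounds=3):
    """Oblivious k-WL refinement on a graph given as a list of neighbor
    sets; returns a color for every ordered k-tuple of nodes."""
    n = len(adj)
    tuples = list(product(range(n), repeat=k))
    # Initial color: ordered isomorphism type of the tuple
    # (adjacency pattern plus equality pattern among positions).
    color = {
        t: (tuple(int(t[j] in adj[t[i]]) for i in range(k) for j in range(k)),
            tuple(int(t[i] == t[j]) for i in range(k) for j in range(k)))
        for t in tuples}
    for _ in range(rounds):
        color = {
            t: hash((color[t],
                     tuple(  # one sorted multiset of colors per position i
                         tuple(sorted(hash(color[t[:i] + (w,) + t[i + 1:]])
                                      for w in range(n)))
                         for i in range(k))))
            for t in tuples}
    return color

# Toy usage: a triangle; adj[v] is the neighbor set of node v.
colors = k_wl_colors([{1, 2}, {0, 2}, {0, 1}], k=2)
```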
Recursive pooling approaches, such as RNP-GNN (Tahmasebi et al., 2020), can count subgraphs of size k and interpolate between simple message passing and full k-WL, trading complexity for targeted expressive power.
4. Algorithmic Frameworks and Implementations
Practical instantiations of HOGNNs face significant computational and memory challenges due to the exponential growth in the number of higher-order objects (on the order of n^k for k-tuples and n^(k+1) for k-simplices). To address this:
- Sparse and recursive pooling architectures exploit graph sparsity to reduce computational overhead (Eitan et al., 2 Oct 2025, Tahmasebi et al., 2020).
- Tools such as PyTorch Geometric High Order (PyGHO) (Wang et al., 2023) provide unified abstractions for high-order data structures (Masked/Sparse Tensor), message passing across arbitrary dimensions, and high-order batching, enabling fast prototyping and minimizing boilerplate for k-tuple and subgraph-based architectures (see the sparse-store sketch after this list).
- Specialized operators and encoders (e.g., IGN for derivative tensors) support expressivity-aligned design.
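As a taste of what a sparsity-aware representation looks like, the generic sketch below (our own illustration, not the PyGHO API) materializes 2-tuple features only for node pairs that actually share an edge:

```python
import torch

# Generic sparse 2-tuple store (illustrative; not the PyGHO API): keep
# features only for ordered pairs connected by an edge, reducing memory
# from O(n^2 * d) dense storage to O(#edges * d).
edge_index = torch.tensor([[0, 1], [1, 0], [1, 2], [2, 1]])  # (E, 2)
tuple_feats = torch.randn(edge_index.size(0), 16)            # one row per pair
lookup = {tuple(e.tolist()): i for i, e in enumerate(edge_index)}

def pair_feature(u, v):
    """Return the stored feature of tuple (u, v), or None if the pair
    was never materialized (i.e., (u, v) is not an edge)."""
    i = lookup.get((u, v))
    return tuple_feats[i] if i is not None else None
```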
5. Empirical Performance, Use Cases, and Limitations
Empirical studies report that HOGNNs yield consistent and often substantial gains over classical GNNs in tasks where higher-order structure is significant. Examples include:
- Molecular property prediction: Explicit geometric and substructure awareness yields state-of-the-art results in chemical datasets, outperforming both GCN and MPNN baselines (Flam-Shepherd et al., 2020, KC et al., 2020, Eitan et al., 2 Oct 2025).
- Social, collaboration, and biological networks: Hypergraph and simplicial models capture complex group interactions and community structure, leading to higher clustering fidelity and improved node classification accuracy (Feng et al., 2018, Duval et al., 2022).
- Motif-based and topological architectures improve robustness, clustering quality, and classification performance, especially on tasks sensitive to network motifs, cycles, or higher homology (Lee et al., 2018, Buffelli et al., 2024).
Trade-offs center on computational cost versus expressivity. Complete k-GNN or subgraph GNN architectures are typically feasible only for small k and modestly sized graphs, becoming impractical otherwise; this motivates sampling-based, recursive, or sparsity-aware reductions, as sketched below.
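The sampling-based reduction can be pictured as follows: a node-marking subgraph GNN runs one base MPNN per marked copy of the graph, and subsampling the marked nodes trades expressivity for cost. This is a sketch under our own naming; `base_mpnn` stands for any message-passing network taking features and an adjacency matrix.

```python
import random
import torch

def subgraph_gnn(X, A, base_mpnn, num_samples=None):
    """Node-marking subgraph GNN sketch: one base-MPNN run per marked
    node, mean-aggregated. num_samples subsamples marks for scalability."""
    n = X.size(0)
    marks = list(range(n))
    if num_samples is not None and num_samples < n:
        marks = random.sample(marks, num_samples)  # sampling-based reduction
    outs = []
    for v in marks:
        Xv = torch.cat([X, torch.zeros(n, 1)], dim=1)
        Xv[v, -1] = 1.0                            # mark node v
        outs.append(base_mpnn(Xv, A))              # (n, d_out) per copy
    return torch.stack(outs).mean(dim=0)           # aggregate over marks
```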
6. Theoretical Insights and Recent Directions
Contemporary research has clarified key theoretical properties of HOGNN families:
- The alignment of k-GNNs, Subgraph GNNs, and HOD-GNNs with the k-WL hierarchy defines a sequence of strictly increasing expressivity classes (Eitan et al., 2 Oct 2025, Morris et al., 2018).
- Invariant Graphon Networks extend these constructions to graph limits, showing that expressivity and transferability hold in dense graphon regimes, with explicit universal approximation results (Herbst et al., 18 Mar 2025).
- Topological neural models (simplicial/cellular) demonstrate improved information flow and mitigate over-squashing by providing higher-dimensional “shortcuts” in computational graphs (Giusti, 2024).
- Path-, motif-, and multi-scale architectures enable flexible modeling of both local and global higher-order dependencies (Lee et al., 2018).
Open challenges concern flexible, data-driven construction of HOGDMs, scalable and memory-efficient computation, extension to dynamic and heterogeneous domains, and systematic benchmarking and comparison (Besta et al., 2024).
7. Summary Table of Major HOGNN Families
| Family | Principal Object | Message Passing Domain | Expressivity (relative to WL) | Computational Cost |
|---|---|---|---|---|
| Hypergraph GNN | Hyperedges | Node–hyperedge incidence | Exceeds 1-WL on group interactions | Linear in #nodes + #hyperedges |
| k-GNN | Node k-tuples | k-tuple adjacency | Matches k-WL | O(n^k) per layer |
| Subgraph GNN | Induced subgraphs | Subgraph message passing | Matches k-WL (for subgraphs of size k) | One base-MPNN run per subgraph |
| Simplicial GNN | Simplices | Boundary/coboundary | Capable of Hodge/CWL inference | Linear in #simplices |
| Motif-GNN | Motif instances | Motif-based adjacency | Surpasses 1-WL | Motif enumeration + GCN |
| HOD-GNN | Node feature derivatives | Node + derivative encoding | Matches k-OSAN/Subgraph GNN (k-WL) | Sparse; exploits graph sparsity |
| Path-GNN | Simple paths | k-path aggregation | Exceeds 1-WL, context- and geometry-aware | Path enumeration (pruning for large k) |
| Topological NN | Simplicial/cell complexes | Multi-dimensional MP | Beyond 1-WL, homology/topology-aware | Depends on complex size |
In conclusion, Higher-Order Graph Neural Networks provide a rich and theoretically grounded toolkit for modeling, reasoning, and learning over graph-structured data enriched with polyadic, combinatorial, and topological structure, with architectures and theoretical underpinnings that both subsume and extend the classic GNN paradigm (Besta et al., 2024, Eitan et al., 2 Oct 2025, Morris et al., 2018, Giusti, 2024, Herbst et al., 18 Mar 2025).