
Weisfeiler-Leman Dimension Overview

Updated 30 November 2025
  • The Weisfeiler-Leman dimension is the minimum dimension $k$ for which the $k$-dimensional WL algorithm distinguishes a graph from every non-isomorphic graph.
  • It connects graph isomorphism, counting logics, and parameterized complexity, offering practical bounds and insights for algorithm design and machine learning.
  • Recent research leverages its role in graph neural networks, group theory, and coherent configurations, while open problems continue to drive theoretical advances.

The Weisfeiler-Leman dimension (WL-dimension) quantifies the minimal dimension $k$ such that the $k$-dimensional Weisfeiler-Leman algorithm uniquely distinguishes a graph or combinatorial structure up to isomorphism. As such, it serves as a central measure of combinatorial and descriptive complexity, directly connecting graph isomorphism algorithms, coherent configurations, logic, and algebraic graph theory. The WL-dimension is deeply tied to the expressive power of fixed-variable first-order logics with counting, the pebble game paradigm, parameterized complexity, and recent developments in representation learning.

1. Definition and General Framework

The $k$-dimensional WL algorithm $\mathrm{WL}_k$ acts on $k$-tuples of vertices of a graph $G = (V, E)$, maintaining an isomorphism-invariant coloring. At initialization, each $k$-tuple is colored according to its equality pattern and the induced subgraph. Iterative refinement recolors each tuple based on the multiset of colors of its one-coordinate neighbors. The process stabilizes to a final coloring, canonically partitioning $V^k$. Two graphs are $\mathrm{WL}_k$-equivalent if their multisets of stabilized colors coincide.
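A minimal, unoptimised Python sketch of this refinement follows (our own illustration; the function name and graph encoding are not taken from the cited papers). It uses a per-vertex aggregation that also records how the substituted vertex relates to the tuple, the common convention under which $k = 1$ recovers classical colour refinement and $\mathrm{WL}_k$ matches $\mathrm{FO}^{k+1}[\mathbf{C}]$; colours are kept as nested tuples rather than compressed, which is wasteful but keeps the invariant directly comparable across graphs.

```python
from itertools import product

def k_wl_colors(adj, k):
    """Minimal sketch of the k-dimensional Weisfeiler-Leman refinement.

    adj: dict mapping each vertex to the set of its neighbours (simple graph).
    Returns the multiset (as a sorted tuple) of stabilised colours of all
    k-tuples -- the WL_k invariant used to compare graphs.
    """
    V = list(adj)

    def atomic_type(t):
        # Initial colour: the equality pattern and adjacency pattern of the
        # tuple, i.e. its labelled (ordered) induced subgraph.
        return tuple((int(t[i] == t[j]), int(t[j] in adj[t[i]]))
                     for i in range(k) for j in range(k))

    colour = {t: atomic_type(t) for t in product(V, repeat=k)}
    while True:
        refined = {}
        for t in colour:
            # For every vertex w, record how w relates to t and the colours of
            # the k "one-coordinate neighbours" obtained by substituting w.
            signature = sorted(
                (tuple((int(w == v), int(w in adj[v])) for v in t),
                 tuple(colour[t[:i] + (w,) + t[i + 1:]] for i in range(k)))
                for w in V)
            refined[t] = (colour[t], tuple(signature))
        # Stop once the colour partition no longer becomes finer.
        if len(set(refined.values())) == len(set(colour.values())):
            return tuple(sorted(refined.values()))
        colour = refined
```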

The WL-dimension $\dim_{\mathrm{WL}}(G)$ is the smallest $k$ for which $\mathrm{WL}_k$ distinguishes $G$ from every non-isomorphic graph. Equivalently, $\dim_{\mathrm{WL}}(G) + 1$ is the number of variables needed to define $G$ up to isomorphism in first-order logic with counting, and the number of pebbles Spoiler requires to win the bijective pebble game (Kiefer et al., 2024, Schneider et al., 2024). For a graph parameter $f$, the WL-dimension of $f$ is the least $k$ such that $f$ is invariant under $\mathrm{WL}_k$-equivalence (Göbel et al., 2023).
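As a toy illustration of these definitions, using the `k_wl_colors` sketch above: the 6-cycle and the disjoint union of two triangles are both 2-regular, so $\mathrm{WL}_1$ (colour refinement) cannot separate them, while $\mathrm{WL}_2$ can; the WL-dimension of either graph is therefore at least $2$.

```python
# C_6 versus two disjoint triangles: both are 2-regular graphs on 6 vertices.
C6 = {i: {(i - 1) % 6, (i + 1) % 6} for i in range(6)}
two_triangles = {0: {1, 2}, 1: {0, 2}, 2: {0, 1},
                 3: {4, 5}, 4: {3, 5}, 5: {3, 4}}

assert k_wl_colors(C6, 1) == k_wl_colors(two_triangles, 1)  # WL_1 cannot tell them apart
assert k_wl_colors(C6, 2) != k_wl_colors(two_triangles, 2)  # WL_2 separates them
```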

2. Connections to Logic and Homomorphism Counts

There is a tight equivalence between the WL-dimension and descriptive complexity theory. The algorithm $\mathrm{WL}_k$ matches the expressive power of $\mathrm{FO}^{k+1}[\mathbf{C}]$, first-order logic with at most $k+1$ variables and counting quantifiers (Brachter et al., 2020, Kiefer et al., 2024). Moreover, $\mathrm{WL}_k$ distinguishes two graphs if and only if they differ in the number of homomorphisms from some graph of treewidth at most $k$ (Göbel et al., 2023). This characterization carries over to graph parameters such as the answer count of conjunctive queries (CQs): for full CQs, the WL-dimension coincides with the treewidth of the pattern graph, while for general CQs the governing quantity is the semantic extension width $\mathrm{sew}(\varphi)$, a combination of treewidth and quantified star size.
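To see the homomorphism-count characterisation on the toy pair from above (a brute-force helper of our own, not an implementation from the cited work): the two $\mathrm{WL}_1$-equivalent graphs agree on homomorphism counts from the 3-vertex path (treewidth $1$) but differ on counts from the triangle (treewidth $2$), which is exactly the level at which $\mathrm{WL}_2$ separates them.

```python
from itertools import product

def hom_count(pattern, host):
    """Brute-force count of homomorphisms from `pattern` into `host`
    (both given as adjacency dicts); exponential, for tiny examples only."""
    pv, hv = list(pattern), list(host)
    count = 0
    for image in product(hv, repeat=len(pv)):
        phi = dict(zip(pv, image))
        # Every edge of the pattern must map to an edge of the host.
        if all(phi[v] in host[phi[u]] for u in pattern for v in pattern[u]):
            count += 1
    return count

P3 = {0: {1}, 1: {0, 2}, 2: {1}}        # path on 3 vertices, treewidth 1
K3 = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}  # triangle, treewidth 2
C6 = {i: {(i - 1) % 6, (i + 1) % 6} for i in range(6)}
two_triangles = {0: {1, 2}, 1: {0, 2}, 2: {0, 1},
                 3: {4, 5}, 4: {3, 5}, 5: {3, 4}}

print(hom_count(P3, C6), hom_count(P3, two_triangles))  # 24 24 -- equal
print(hom_count(K3, C6), hom_count(K3, two_triangles))  # 0 12  -- different
```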

The following table collects representative bounds on the WL-dimension for specific graph and group classes:

| Structure/Class | WL-dimension bound | Reference |
| --- | --- | --- |
| General $n$-vertex graphs | $\leq 0.25n + o(n)$ | (Kiefer et al., 2024) |
| General $n$-vertex graphs | $\leq 0.15n + o(n)$ | (Schneider et al., 2024) |
| Distance-hereditary graphs | $2$ | (Gavrilyuk et al., 2020) |
| Planar graphs | $\leq 3$ | (Kiefer et al., 2017, Li et al., 2023) |
| Polyhedral/3-connected planar graphs | $\leq 2$ (if schurian) | (Li et al., 2023) |
| Graphs of genus $g$ | $\leq 4g + 3$ | (Grohe et al., 2019) |
| Graphs of orientable genus $g$ | $\leq 2g + 3$ | (Grohe et al., 2019) |
| Circulant graphs (order $n$) | $\leq \Omega(n) + 3$ | (Wu et al., 2024) |
| Circulant graphs (lower bound) | $\geq c\sqrt{\log n}$ | (Wu et al., 14 Jul 2025) |
| Permutation graphs | $\leq 18$ | (Guo et al., 2023) |
| Strongly regular graphs (Fon-Der-Flaass) | $\leq 4$ | (Cai et al., 2023) |
| Abelian groups | $2$ | (Brachter et al., 2021) |
| Finite groups (general) | $\leq 5$ | (Brachter et al., 2021) |

3. Structural Bounds and Extremal Constructions

The WL-dimension is tightly linked to structural graph parameters. For graphs of treewidth $k$, the dimension is at most $k$ (for the variant requiring stabilization within logarithmically many rounds, at most $3k+4$ (Levet et al., 2023), compared with $4k+3$ in prior work). For graphs of genus $g$, a linear upper bound of $4g+3$ holds, improving to $2g+3$ on orientable surfaces (Grohe et al., 2019). For planar graphs, dimension $3$ is tight; for polyhedral graphs with schurian coherent configurations it drops to $2$, yet no explicit example attaining $3$ is known (Kiefer et al., 2017, Li et al., 2023). Abundant families of strongly regular graphs with constant WL-dimension are now known (Cai et al., 2023).

Expander-based Cai–Fürer–Immerman (CFI) constructions provide the canonical lower-bound families: distinguishing all $n$-vertex graphs requires $\Omega(n)$ variables, while random graphs fall into low-dimension regimes (Grohe et al., 2023, Schneider et al., 2024). For circulant graphs, the dimension is unbounded but lies between $\Omega(\sqrt{\log n})$ and $O(\log n)$ (Wu et al., 14 Jul 2025, Wu et al., 2024). For permutation graphs, the dimension is at most $18$ via modular decomposition and coherent-configuration separability (Guo et al., 2023).
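One common middle-vertex-free variant of the CFI gadget can be sketched as follows (our own encoding and naming; a simplified illustration rather than the exact constructions used in the cited papers, and quadratic-time only for tiny base graphs):

```python
from itertools import combinations

def cfi_graph(base_adj, twisted_edges=frozenset()):
    """Middle-vertex-free CFI-style construction over a base graph.

    Each base vertex v becomes the gadget {(v, S) : S an even-sized subset of
    the edges at v}.  For a base edge e = {u, v}, gadget vertices (u, S) and
    (v, T) are joined iff S and T agree on membership of e, with the condition
    flipped for edges in `twisted_edges`.  For a connected base graph, an even
    and an odd number of twists yield non-isomorphic graphs.
    """
    edges_at = {v: frozenset(frozenset({v, w}) for w in base_adj[v])
                for v in base_adj}
    gadgets = [(v, frozenset(S))
               for v in base_adj
               for r in range(0, len(edges_at[v]) + 1, 2)
               for S in combinations(edges_at[v], r)]
    adj = {x: set() for x in gadgets}
    for v in base_adj:
        for w in base_adj[v]:
            e = frozenset({v, w})
            flip = e in twisted_edges
            for (a, S) in gadgets:
                for (b, T) in gadgets:
                    if a == v and b == w and ((e in S) == (e in T)) != flip:
                        adj[(a, S)].add((b, T))
                        adj[(b, T)].add((a, S))
    return adj

# Over the triangle, the untwisted CFI graph is two disjoint triangles and the
# once-twisted one is a 6-cycle -- exactly the pair that WL_1 cannot separate.
K3 = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
plain, twisted = cfi_graph(K3), cfi_graph(K3, {frozenset({0, 1})})

# Using the k_wl_colors sketch from Section 1:
assert k_wl_colors(plain, 1) == k_wl_colors(twisted, 1)
assert k_wl_colors(plain, 2) != k_wl_colors(twisted, 2)
```

Over base graphs with large treewidth or expansion, distinguishing the twisted from the untwisted CFI graph forces the WL-dimension to grow, which is the mechanism behind the linear lower bounds mentioned above.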

4. Complexity, Algorithms, and Separability

Determining the WL-dimension is algorithmically and complexity-theoretically challenging. The decision problem "is $\dim_{\mathrm{WL}}(G) \leq k$?" is NP-hard, via a reduction from TREEWIDTH and a CFI encoding; this holds even for graphs of color multiplicity at most $4$ (Lichter et al., 2024). For fixed $k$, polynomial-time algorithms exist for graphs of color multiplicity at most $5$, using coherent-configuration separability, modular reduction, and automorphism-group computations. For abelian color classes of arbitrary multiplicity, approximation algorithms run in $n^{O(k)}$ time and provide explicit bounds (Lichter et al., 2024).

Separability of the (coherent) configuration produced by $k$-WL is central: a configuration is separable iff every algebraic isomorphism arises from a combinatorial isomorphism. The WL-dimension is tightly coupled to the separability number; for permutation graphs, modular decomposition and the analysis of uniquely orientable components yield the bound $s(\mathrm{WL}_2(X)) \leq 6$ (Guo et al., 2023).

5. Algorithmic and Parameterized Implications

The WL-dimension governs the parameterized complexity of a range of graph problems. For conjunctive query answer counting, the WL-dimension coincides with the semantic extension width; thus, fixed-parameter tractability is equivalent to bounded WL-dimension for classes of queries (Göbel et al., 2023). For circulant graphs, isomorphism can be decided in $n^{O(\log n)}$ time via the $\Omega(n)+3$ WL bound, where $\Omega(n)$ counts the prime factors of $n$ with multiplicity (Wu et al., 2024). For classes with vertex cover number at most $t$, the dimension is at most $(2/3)t+3$ (Kiefer et al., 2024).

For distance-hereditary graphs, dimension $2$ suffices, so the classical two-dimensional WL algorithm already decides isomorphism (Gavrilyuk et al., 2020). For polyhedral graphs whose coherent configuration is schurian, $2$-dimensional WL likewise suffices. For general graphs, successive refinement and local reductions in the potential function yield improvements to the constant in the linear upper bound (Schneider et al., 2024).

6. Connections to Algebra, Group Theory, and Machine Learning

The WL algorithm has powerful applications in finite group theory: many classic isomorphism invariants (including the center, commutator subgroup, derived/upper/lower central series, socle, and composition factors) are detectable by low-dimensional WL algorithms, typically with $k \leq 5$ (Brachter et al., 2021). Taking direct products increases the WL-dimension by at most $1$. The graph-to-group correspondence (Mekler's construction) allows lower bounds to be transferred from graphs to nilpotent class-$2$ exponent-$p$ groups, yet in specific cases the group WL-dimension is much smaller than the underlying graph construction would suggest (Brachter et al., 2020).

Recent work links the WL-dimension to Graph Neural Networks (GNNs): order-$k$ GNNs can be no more powerful than $k$-dimensional WL; consequently, the minimal order necessary to count the answers of a CQ is exactly $\mathrm{sew}(\varphi)$ (Göbel et al., 2023). The dimension thus provides rigorous limits on GNN expressiveness.
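The base case of this correspondence is easy to see in code: a standard message-passing network, whatever its weights, computes identical features on vertices that colour refinement cannot distinguish, so its graph-level readout cannot separate $\mathrm{WL}_1$-equivalent graphs. Below is a toy NumPy sketch (a hypothetical sum-aggregation network of our own, not any particular published architecture):

```python
import numpy as np

def mpnn_embedding(adj, num_layers=3, dim=8, seed=0):
    """Toy sum-aggregation message-passing network with random weights;
    its graph-level output is invariant under WL_1 (colour refinement)."""
    rng = np.random.default_rng(seed)
    weights = [rng.standard_normal((dim, dim)) for _ in range(num_layers)]
    h = {v: np.ones(dim) for v in adj}   # identical initial features
    for W in weights:
        # Each layer combines a vertex's feature with the sum of its neighbours'.
        h = {v: np.tanh(W @ (h[v] + sum(h[u] for u in adj[v]))) for v in adj}
    return sum(h[v] for v in adj)        # permutation-invariant sum readout

C6 = {i: {(i - 1) % 6, (i + 1) % 6} for i in range(6)}
two_triangles = {0: {1, 2}, 1: {0, 2}, 2: {0, 1},
                 3: {4, 5}, 4: {3, 5}, 5: {3, 4}}

# WL_1 cannot separate these graphs, and neither can the message-passing net:
print(np.allclose(mpnn_embedding(C6), mpnn_embedding(two_triangles)))  # True
```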

7. Open Problems and Future Directions

Key unresolved questions and research frontiers include:

  • Asymptotic bounds for general graphs: The best known lower bounds for $n$-vertex graphs are of order $\Omega(n)$, while the upper bound is now $0.15n + o(n)$; narrowing the constant-factor gap or discovering sharper extremal families is a major challenge (Schneider et al., 2024, Kiefer et al., 2024).
  • Circulant graphs and Cayley graph families: Determining whether the $O(\log n)$ upper bound on the WL-dimension of circulant graphs is tight or can be improved, and mapping the relationship to group-theoretic invariants (Wu et al., 14 Jul 2025, Wu et al., 2024).
  • Broader combinatorial designs: Whether other abundant combinatorial structures (e.g., Latin squares, Steiner systems) admit constant or bounded WL-dimension (Cai et al., 2023).
  • Algorithmic extraction of short distinguishing dimension: Efficiently finding minimal kk for individual graphs is an open algorithmic problem (Schneider et al., 2024).
  • Extension to tensors and higher-arity relational structures: Suitable analogues of WL for general relational systems with higher arity remain largely unexplored (Göbel et al., 2023).
  • Descriptive complexity and logic: Further pinning down the variable requirements for FO+C definitions of graphs, and the interaction with counting quantifiers and circuit lower bounds (Kiefer et al., 2024).

The Weisfeiler-Leman dimension thus remains at the core of combinatorial, logical, and algorithmic graph theory, offering a precise technical lens to study isomorphism, symmetry, and structure at both theoretical and applied levels.
