
Hyperbolic Graph Embeddings: Models & Methods

Updated 28 December 2025
  • Hyperbolic graph embedding models are geometric techniques that exploit negatively curved spaces to capture hierarchical and scale-free network structures.
  • These models leverage Riemannian manifolds like the Poincaré ball and Lorentz hyperboloid, using neural, variational, and attention-based methods for efficient representation.
  • Applications span knowledge graph completion, anomaly detection, community modeling, and graph generation, consistently outperforming Euclidean approaches.

Hyperbolic graph embedding models are a class of geometric representation techniques that exploit the properties of negatively curved spaces to yield compact, low-distortion embeddings of networks, particularly those that exhibit hierarchical, scale-free, or power-law structure. Unlike Euclidean approaches, hyperbolic methods harness exponential volume growth and thin-triangle (tree-like) metric structure to faithfully model the underlying graph topology. The foundational models leverage Riemannian manifolds such as the Poincaré ball or the Lorentz hyperboloid, deploying specialized neural, variational, and analytical methods for learning node, edge, community, and relation representations directly on these manifolds. Hyperbolic graph embedding advances knowledge graph completion, community detection, anomaly detection, graph generation, and heterogeneous graph modeling by integrating techniques ranging from hyperbolic rotations and attention to contrastive learning within manifold-aware optimization frameworks.

1. Mathematical Foundations and Hyperbolic Manifolds

Hyperbolic spaces are complete, simply-connected Riemannian manifolds with constant negative curvature. The most widely utilized models are:

  • Poincaré Ball Model: Defined as $\mathbb{B}^n = \{ x \in \mathbb{R}^n : \|x\| < 1 \}$, equipped with the metric tensor $g_x^{\mathbb{B}} = \lambda_x^2 I_n$, where $\lambda_x = \frac{2}{1 - \|x\|^2}$ is the conformal factor, and geodesic distance

$$d_{\mathbb{B}}(u,v) = \operatorname{arcosh}\left(1 + 2\,\frac{\|u-v\|^2}{(1 - \|u\|^2)(1 - \|v\|^2)}\right)$$

(Sadat et al., 21 Dec 2025, Chamberlain et al., 2017).

  • Lorentz (Hyperboloid) Model: Manifold $\mathcal{L}^n_c = \{ \mathbf{x} \in \mathbb{R}^{n+1} : \langle\mathbf{x},\mathbf{x}\rangle_{\mathcal{L}} = -1/c,\ x_0 > 0 \}$ with metric tensor $g_{\mathcal{L}} = \operatorname{diag}(-1, 1, \ldots, 1)$. The Lorentzian inner product is $\langle\mathbf{x},\mathbf{y}\rangle_{\mathcal{L}} = -x_0 y_0 + \sum_{i=1}^{n} x_i y_i$, and the squared geodesic distance is $d_{\mathcal{L}}^2(\mathbf{x},\mathbf{y}) = -\frac{2}{c} - 2\langle\mathbf{x},\mathbf{y}\rangle_{\mathcal{L}}$, which vanishes exactly when $\mathbf{x} = \mathbf{y}$ since $\langle\mathbf{x},\mathbf{x}\rangle_{\mathcal{L}} = -1/c$ (Liang et al., 6 Nov 2024, Wang et al., 2020).

Tangent spaces, together with the exponential and logarithmic maps, connect Euclidean parameters to manifold points; they are critical both for initialization and for operations such as Möbius addition and Möbius matrix-vector multiplication.
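
For concreteness, here is a minimal NumPy sketch of these operations on the unit Poincaré ball (curvature -1); the function names are ours, not from any cited library:

```python
import numpy as np

def mobius_add(u, v, eps=1e-9):
    """Möbius addition u ⊕ v on the unit Poincaré ball (curvature -1)."""
    uv = np.dot(u, v)
    nu2, nv2 = np.dot(u, u), np.dot(v, v)
    num = (1 + 2 * uv + nv2) * u + (1 - nu2) * v
    return num / (1 + 2 * uv + nu2 * nv2 + eps)

def exp0(v, eps=1e-9):
    """Exponential map at the origin: tangent vector -> ball point."""
    n = np.linalg.norm(v) + eps
    return np.tanh(n) * v / n

def log0(x, eps=1e-9):
    """Logarithmic map at the origin: ball point -> tangent vector."""
    n = np.linalg.norm(x) + eps
    return np.arctanh(min(n, 1 - 1e-7)) * x / n

def poincare_dist(u, v):
    """Geodesic distance d_B(u, v), matching the formula above."""
    num = 2 * np.dot(u - v, u - v)
    den = (1 - np.dot(u, u)) * (1 - np.dot(v, v))
    return np.arccosh(1 + num / den)
```

A hyperbolic "linear layer" can then be realized as `exp0(W @ log0(x))`, which is the tangent-space pattern several of the architectures in Section 3 rely on.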

2. Hyperbolic Knowledge Graph Embedding Methods

Hyperbolic KGE approaches leverage negative curvature to model both local relational patterns and global hierarchical structures:

  • Fully Hyperbolic Rotation (FHRE) uses the Lorentz model to represent entities as points on Lcn\mathcal{L}^n_c and relations as block-diagonal Lorentz rotations acting directly on the manifold. Scoring functions rely on the negative squared Lorentz distance. This direct manifold-level approach eschews repeated exponential/logarithmic mappings, reducing numerical instability and enabling efficient Riemannian optimization via Adam. FHRE achieves state-of-the-art MRRs on CoDEx-s/m and strong scores with fewer parameters compared to Euclidean or partially hyperbolic baselines (Liang et al., 6 Nov 2024).
  • AttH (Attention Hyperbolic KGE) formalizes rotations and reflections as block-diagonal Givens isometries in the Poincaré ball, applies attention for mixed-relational patterns, and trains per-relation curvature parameters. Möbius addition and scoring are used for triple evaluation. Learned curvatures and attention operators enable faithful modeling of both hierarchy and logical relation types; low-dimensional AttH surpasses Euclidean and prior hyperbolic models for most KG benchmarks (Chami et al., 2020).
  • 3H-TH introduces quaternionic 3D rotation (Hamilton products) combined with Möbius translation in the Poincaré ball, allowing comprehensive modeling of symmetry, antisymmetry, inversion, commutative/non-commutative composition, hierarchy, and multiplicity in KG relations. Relation-specific curvature parameters further enhance hierarchical modeling (Zhu et al., 2023).
  • Complex Hyperbolic FFT KGE exploits FFT/IFFT to translate between complex and real hyperbolic spaces, supporting efficient attention and isometric transformations for multi-relational reasoning (Xiao et al., 2022).

These methods demonstrate that explicit manifold-respecting operators (e.g., Lorentz rotations, Möbius addition) and curvature learning are essential for simultaneously achieving hierarchy fidelity and logical expressiveness in low dimensions.
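
To make the scoring pattern concrete, here is a minimal NumPy sketch of a negative squared-Lorentz-distance score in the spirit of FHRE, with the relation acting as block-diagonal 2×2 Givens rotations on the spatial coordinates. The parameterization is our illustrative simplification, not the paper's exact formulation:

```python
import numpy as np

def lorentz_inner(x, y):
    """Lorentzian inner product <x, y>_L = -x0*y0 + sum_i xi*yi."""
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def sq_lorentz_dist(x, y, c=1.0):
    """Squared geodesic distance on L^n_c (see Section 1)."""
    return -2.0 / c - 2.0 * lorentz_inner(x, y)

def rotate_spatial(x, thetas):
    """Block-diagonal 2x2 rotations on the spatial part of x; the time
    coordinate x0 is untouched, so the point stays on the hyperboloid.
    Assumes an even spatial dimension: one angle per 2x2 block."""
    out = x.copy()
    for i, t in enumerate(thetas):
        a, b = out[1 + 2 * i], out[2 + 2 * i]
        out[1 + 2 * i] = np.cos(t) * a - np.sin(t) * b
        out[2 + 2 * i] = np.sin(t) * a + np.cos(t) * b
    return out

def score(head, rel_thetas, tail, c=1.0):
    """Higher score = more plausible triple (negative squared distance)."""
    return -sq_lorentz_dist(rotate_spatial(head, rel_thetas), tail, c)
```

Because spatial rotations preserve the Lorentzian norm, no exponential or logarithmic map is needed anywhere in the scoring path, which is the source of the numerical-stability benefit noted above.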

3. Hyperbolic Neural and Variational Architectures for Graphs

Hyperbolic graph neural networks (GNNs) and autoencoder models generalize classical spectral and message-passing frameworks:

  • Hyperbolic Graph Convolutional Network (HGCN): Utilizes the Lorentz model with per-layer curvature; performs log-exp mappings so that linear transforms and aggregation run in tangent space, enabling inductive node representations that preserve hierarchy and scale-free topology. Layer-wise curvature adaptation is integral (Chami et al., 2019, Sadat et al., 21 Dec 2025); a simplified layer is sketched after this list.
  • Hyperbolic-to-Hyperbolic GCN (H2H-GCN): All operations reside natively on the Lorentz manifold, including neighborhood aggregation via the Einstein midpoint, removing reliance on tangent-space approximation (Sadat et al., 21 Dec 2025).
  • Hyperbolic Graph Autoencoders (HGCAE, Poincaré-VAE): Latent manifolds are structured as Poincaré balls or Lorentz hyperboloids; encoders use exponential map layers; decoders implement gyroplane or tangent-space projections. The VAE objective leverages wrapped normals, with optimization via Riemannian Adam (Sadat et al., 21 Dec 2025, Rezaabad et al., 2020).
  • Semi-Implicit Variational Inference in Hyperbolic Space: Enhanced SI-VAE approaches mitigate the posterior collapse of naïve variants by adding mutual-information regularization, leading to greater retention of input–latent correlations and improved link prediction and node classification (Rezaabad et al., 2020).
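
A minimal sketch of the tangent-space message-passing pattern used by HGCN-style layers, assuming the Lorentz model with curvature -1 and origin-anchored maps; this is our simplified single-layer illustration, not the authors' code:

```python
import numpy as np

def exp0_lorentz(v):
    """Exp map at the Lorentz origin o=(1,0,...,0): spatial tangent -> manifold."""
    n = np.linalg.norm(v) + 1e-9
    return np.concatenate([[np.cosh(n)], np.sinh(n) * v / n])

def log0_lorentz(x):
    """Log map at the origin: manifold point -> spatial tangent coordinates."""
    d = np.arccosh(np.clip(x[0], 1.0, None))
    n = np.linalg.norm(x[1:]) + 1e-9
    return d * x[1:] / n

def hgcn_layer(X, A, W):
    """One tangent-space hyperbolic GCN step:
    log -> linear transform -> neighborhood mean -> nonlinearity -> exp."""
    H = np.stack([log0_lorentz(x) for x in X])     # to tangent space
    H = H @ W.T                                    # Euclidean linear map
    deg = A.sum(axis=1, keepdims=True) + 1e-9
    H = (A @ H) / deg                              # mean aggregation
    H = np.maximum(H, 0.0)                         # ReLU in tangent space
    return np.stack([exp0_lorentz(h) for h in H])  # back to the manifold
```

H2H-GCN removes exactly the log/exp round-trips in this sketch by aggregating natively on the manifold via the Einstein midpoint.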

Theoretical analyses show that these methods yield compact, low-distortion, high-fidelity embeddings and outperform Euclidean analogues for tasks hinging on global graph consistency (e.g., link prediction, anomaly detection).

4. Models for Community, Role, and Heterogeneous Graph Embedding

Hyperbolic models support structural-role identification, community detection, and heterogeneous graph tasks:

  • Hyperboloid struct2vec extends structural role identity embedding into hyperbolic space by generalizing structural similarity, multilayer random walks, and Riemannian optimization. It reduces computational cost and yields improved SVM classification accuracy on air-traffic networks compared to Euclidean struct2vec or node2vec (Wang et al., 2020).
  • Hyperbolic Community Embedding (H-GMM, H-K-Means): Embeddings reside in the Poincaré ball; communities are modeled by Riemannian Gaussian mixtures fit via EM. Supervised and unsupervised clustering results show that hyperbolic GMMs outperform Euclidean baselines at much lower dimension, indicating exponential separation efficiency (Gerald et al., 2019); a minimal clustering sketch follows this list.
  • Hyperbolic Heterogeneous Graph Attention Networks (HHGAT, MSGAT, MHCL): These allocate multiple Poincaré balls with learnable curvature—one per metapath—to better fit diverse power-law graph substructures. Intra-space and inter-space attention mechanisms ensure that learned node representations aggregate information both structurally and semantically. Multi-space models (MSGAT, MHCL) with metapath-wise curvature markedly outperform single-space (global curvature) models and Euclidean attention mechanisms (Park et al., 18 Nov 2024, Park et al., 15 Apr 2024, Park et al., 20 Jun 2025). MHCL further deploys hyperbolic contrastive learning to maximally separate embeddings by metapath, optimizing discriminability.
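
As an illustration of manifold-aware clustering in this vein, here is a minimal sketch of hyperbolic K-means on the unit Poincaré ball; the tangent-space centroid is a common approximation of the exact Fréchet mean, and all names here are ours:

```python
import numpy as np

def _dist(u, v):
    """Poincaré-ball geodesic distance (unit ball, curvature -1)."""
    num = 2 * np.dot(u - v, u - v)
    den = (1 - np.dot(u, u)) * (1 - np.dot(v, v))
    return np.arccosh(1 + num / den)

def _exp0(v):
    n = np.linalg.norm(v) + 1e-9
    return np.tanh(n) * v / n

def _log0(x):
    n = np.linalg.norm(x) + 1e-9
    return np.arctanh(min(n, 1 - 1e-7)) * x / n

def hyperbolic_kmeans(X, k, iters=20, seed=0):
    """K-means on the Poincaré ball. Centroids are approximated by
    averaging in the tangent space at the origin, a common shortcut
    for the exact Fréchet mean."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        # assign each point to the nearest centroid under geodesic distance
        labels = np.array([np.argmin([_dist(x, c) for c in centers]) for x in X])
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centers[j] = _exp0(np.mean([_log0(x) for x in members], axis=0))
    return labels, centers
```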

5. Optimization, Numerical Stability, and Practical Considerations

Riemannian manifold optimization is a defining feature:

  • Riemannian Adam and SGD: All model parameters, including entity, role, and attention vectors as well as curvatures, are updated using manifold-aware optimizers. Updates use log and exp maps and tangent-space projections, with retraction via exponential maps ensuring that parameters remain on the manifold (Liang et al., 6 Nov 2024, Chami et al., 2019, Sadat et al., 21 Dec 2025, Wang et al., 2020); a single-step sketch follows this list.
  • Curvature Learning: Adapting curvature per layer, relation, or metapath is critical. Models with a fixed global curvature cannot simultaneously fit the branching factors or semantic diversity of many real graphs (Sadat et al., 21 Dec 2025, Chami et al., 2020, Park et al., 18 Nov 2024).
  • Parameter and Time Complexity: Manifold-respecting operations (Möbius addition, Lorentz rotations) incur more computational cost than standard Euclidean linear algebra but offer lower parameter overhead and greater representational power (particularly in low dimensions) (Liang et al., 6 Nov 2024, Chami et al., 2020, Zhu et al., 2023).
  • Numerical Stability: Boundary drift and mapping instability are addressed by limiting repeated exp/log calls, keeping explicit maps to a minimum, clipping updates, and, where possible, working in tangent space and applying retraction only once per epoch (Liang et al., 6 Nov 2024, Yu et al., 2022).
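
For concreteness, here is a minimal single-step sketch of Riemannian SGD on the unit Poincaré ball, combining the inverse-metric rescaling with a simple norm clip against boundary drift; the first-order retraction and the clipping constant are our simplifications:

```python
import numpy as np

def rsgd_step(x, euclid_grad, lr=0.01, max_norm=1.0 - 1e-5):
    """One Riemannian SGD step on the unit Poincaré ball.
    The Riemannian gradient is the Euclidean gradient rescaled by the
    inverse metric (1 - ||x||^2)^2 / 4; projecting back inside the ball
    keeps the iterate strictly away from the boundary."""
    scale = (1.0 - np.dot(x, x)) ** 2 / 4.0
    x_new = x - lr * scale * euclid_grad   # first-order retraction
    n = np.linalg.norm(x_new)
    if n >= max_norm:                      # clip to avoid boundary drift
        x_new = x_new * (max_norm / n)
    return x_new
```

A full Riemannian Adam additionally parallel-transports its moment estimates between tangent spaces; the rescale-then-retract pattern above is the core shared by both optimizers.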

6. Applications, Empirical Performance, and Open Questions

Hyperbolic graph embedding models have advanced the state-of-the-art in several domains:

| Task Type | Hyperbolic Model | Best Reported Performance | Reference |
|---|---|---|---|
| KG Completion (MRR) | FHRE | CoDEx-s: 0.598, CoDEx-m: 0.391 | (Liang et al., 6 Nov 2024) |
| KG Completion (MRR) | AttH | WN18RR: 49.6%, YAGO3-10: 57.7% | (Chami et al., 2020) |
| Community Detection | H-GMM in 2D | DBLP: Precision@1 ≈ 79% | (Gerald et al., 2019) |
| Node Role SVM (accuracy) | Hyperboloid struct2vec | Brazilian: 0.780, American: 0.670 | (Wang et al., 2020) |
| Anomaly Detection (F1) | Poincaré-VAE | Elliptic: 94% | (Sadat et al., 21 Dec 2025) |
| Node Classification (accuracy) | HGCL (contrastive) | Cora: 82.4%, Disease: 93.4% | (Liu et al., 2022) |

Empirical results show that hyperbolic models consistently outperform Euclidean baselines on hierarchical and scale-free graphs, both in low-dimensional and in parameter-constrained regimes. Hyperbolic embeddings facilitate anomaly detection by amplifying deviations from tree-like manifold structure, enable effective role separation and compact community modeling, and yield principled encoding of logical KG relations. Open questions include further scaling Riemannian optimization, dynamic or adaptive curvature learning, interpretability of learned geometries per relation/metapath, extension to heterogeneous or dynamic graphs, and integration of self-supervision and generative modeling in hyperbolic latent spaces (Sadat et al., 21 Dec 2025, Wen et al., 2023).

7. Future Directions and Theoretical Extensions

Recent research highlights several directions for further development:

  • Full Lorentz Group Operations: Extending beyond rotations to boosts and reflections in the Lorentz model may enable richer relational modeling in KGs (Liang et al., 6 Nov 2024).
  • Mixed-Curvature Product Spaces: Allocation of varying curvature geometries to different graph subregions, or adaptive curvature block decompositions, is a promising method for capturing heterogeneous branching and semantic patterns (Park et al., 18 Nov 2024, Park et al., 20 Jun 2025).
  • Contrastive and Self-Supervised Learning: Integration of contrastive learning and position-consistency constraints in hyperbolic space enhances representation power and generalization (Liu et al., 2022, Park et al., 20 Jun 2025); a minimal loss sketch follows this list.
  • Hyperbolic Generative Models: Diffusion models and variational architectures leveraging hyperbolic latent spaces have demonstrated improved graph generation fidelity for power-law and molecular domains (Wen et al., 2023).
  • Open-Source Libraries: The development of toolkits such as Ghypeddings (Sadat et al., 21 Dec 2025) supports broader experimentation and application, including manifold neural modules, Riemannian optimizers, and downstream anomaly or classification wrappers.
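
As a sketch of the contrastive pattern, here is an InfoNCE-style objective that uses negative Poincaré distance as the similarity; this is our illustrative formulation, not the loss from any specific cited paper:

```python
import numpy as np

def poincare_dist(u, v):
    """Geodesic distance on the unit Poincaré ball (see Section 1)."""
    num = 2 * np.dot(u - v, u - v)
    den = (1 - np.dot(u, u)) * (1 - np.dot(v, v))
    return np.arccosh(1 + num / den)

def hyperbolic_infonce(anchor, positive, negatives, temp=0.5):
    """InfoNCE with similarity = negative geodesic distance: pulls the
    positive pair together and pushes negatives apart on the ball."""
    sims = np.array([-poincare_dist(anchor, positive)] +
                    [-poincare_dist(anchor, n) for n in negatives]) / temp
    sims -= sims.max()                  # numerical stability for exp
    return -(sims[0] - np.log(np.exp(sims).sum()))
```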

This body of work establishes hyperbolic graph embedding as a foundational technique for metric-consistent, hierarchy-respecting network representation, suggesting manifold-level modeling as a design principle for both theoretical models and practical graph machine learning systems.
