RicciKGE: Adaptive Geometry for Graph Embeddings
- RicciKGE is a knowledge graph embedding framework that dynamically adapts entity representations via extended Ricci flow and local discrete curvature adjustment.
- It iteratively updates both metric structure and distances, ensuring exponential curvature flattening and linear convergence for enhanced embedding fidelity.
- Empirical results demonstrate improved link prediction and node classification, validating the method’s ability to regularize heterogeneous graph geometries.
RicciKGE is a knowledge graph embedding framework that couples embedding optimization with local discrete Ricci curvature adjustment via an extended Ricci flow. Its central innovation is the dynamic co-evolution of entity representations and underlying geometry, enabling the embedding manifold to adapt continuously to the sharply heterogeneous curvature exhibited by real-world graph data. Unlike conventional approaches where all entities are projected onto a fixed homogeneous manifold (e.g., Euclidean $\mathbb{E}^n$, hyperbolic $\mathbb{H}^n$, spherical $\mathbb{S}^n$, or product spaces), RicciKGE modifies both distances and metric structure iteratively, regularized by local curvature and embedding loss gradients. Rigorous analysis establishes exponential flattening of curvature, linear convergence of distances, and improved predictive accuracy across benchmark link prediction and node classification tasks (Luo et al., 8 Dec 2025).
1. Motivation and Limitations of Homogeneous Embedding Geometries
Conventional knowledge graph embedding (KGE) models (e.g., TransE, RotatE, DistMult, AttH, GoldE) select a global host manifold in which all entity and relation vectors reside. Variants employing multi-curvature or product manifolds can represent certain complex patterns, but still lack fine-grained spatial adaptation. Empirical discrete Ricci curvature in knowledge graphs varies sharply by region: dense clusters, bottleneck chains, and hierarchical motifs each induce distinct local curvature. Overly rigid embedding geometries force positive curvature regions to be under-expanded in hyperbolic models, and negative curvature zones to be over-stretched in spherical or Euclidean embeddings. This mismatch perturbs true relational distances, diminishing KGE expressiveness for link prediction and classification (Luo et al., 8 Dec 2025).
2. Extended Ricci Flow Formulation
RicciKGE augments classical Ricci flow with explicit coupling to the KGE loss gradient, yielding the following continuous formulation for the manifold metric $g_{ij}$:

$$\frac{\partial g_{ij}}{\partial t} = -2 R_{ij} - \lambda\, \nabla_{g_{ij}} \mathcal{L},$$

where $R_{ij}$ is the Ricci tensor, $\mathcal{L}$ is the instantaneous KGE loss (a function of the Riemannian distance $d_g$), and $\lambda > 0$ is the coupling coefficient. In discrete graph settings, the metric is represented by edge weights $w_{uv}$ with update

$$w_{uv}^{(t+1)} = w_{uv}^{(t)} - \eta\left(\kappa_{uv}\, w_{uv}^{(t)} + \lambda\, g_{uv}\right),$$

where $\kappa_{uv}$ is the discrete Ricci curvature of edge $(u, v)$ and $g_{uv}$ is the product of the loss gradients at its endpoints. The coupled update jointly contracts or expands edges based on curvature signs and loss gradients, directly steering spatial regularity and relational fidelity (Luo et al., 8 Dec 2025).
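The discrete edge-weight update can be sketched as follows; the function name, step size, and argument layout are illustrative assumptions, not the paper's notation:

```python
import numpy as np

def coupled_edge_update(w, kappa, grad_u, grad_v, eta=0.01, lam=0.1):
    """One discrete step of the extended Ricci flow on a single edge.

    w              : current edge weight (embedding distance between endpoints)
    kappa          : discrete Ricci curvature of the edge
    grad_u, grad_v : KGE loss gradients at the two endpoint entities
    eta            : flow step size; lam : curvature/loss coupling coefficient

    Positively curved edges (kappa > 0) contract and negatively curved
    edges expand, mirroring the -2*Ric term of the continuous flow; the
    loss term steers the geometry toward predictive utility.
    """
    loss_term = lam * float(np.dot(grad_u, grad_v))  # product of endpoint gradients
    return w - eta * (kappa * w + loss_term)
```

With zero loss gradients this reduces to a pure discrete Ricci flow step, so a positively curved edge shrinks and a negatively curved one stretches.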
3. Joint Update Dynamics for Embeddings and Geometry
Central to RicciKGE is the tightly linked evolution of entity embeddings and inter-entity distances. For each triple $(h, r, t)$:
- The distance $d(h, t)$ between head and tail embeddings is mapped to an edge weight $w_{ht}$.
- Ricci flow steps adjust $w_{ht}$ via local curvature $\kappa_{ht}$ and loss gradients, inducing a targeted distance change $\Delta d_{ht}$.
- Rather than adjusting distances explicitly, the endpoint embeddings $\mathbf{e}_h$ and $\mathbf{e}_t$ are perturbed minimally (minimizing $\|\delta_h\|^2 + \|\delta_t\|^2$) subject to the prescribed $\Delta d_{ht}$, yielding closed-form updates via Lagrange multipliers, with the multiplier determined to enforce the cumulative $\Delta d_{ht}$. Entity updates are aggregated over all triples containing that entity (Luo et al., 8 Dec 2025).
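For a Euclidean distance the minimal-norm perturbation has a simple closed form; the sketch below (my notation, under the Euclidean assumption) shows the symmetric split that the Lagrange-multiplier derivation yields:

```python
import numpy as np

def minimal_perturbation(e_h, e_t, delta_d):
    """Closed-form minimal-norm update of two endpoint embeddings that
    realizes a prescribed distance change delta_d (Euclidean sketch).

    Solving  min ||d_h||^2 + ||d_t||^2  s.t.  ||(e_h+d_h) - (e_t+d_t)|| = d + delta_d
    with a Lagrange multiplier gives equal and opposite moves of
    delta_d / 2 along the unit direction between the endpoints.
    """
    diff = e_h - e_t
    d = np.linalg.norm(diff)
    u = diff / d                          # unit direction from tail to head
    return e_h + 0.5 * delta_d * u, e_t - 0.5 * delta_d * u
```

Because both endpoints move half the required amount along the connecting direction, the new distance is exactly $d + \Delta d$ while the total squared perturbation is minimized.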
4. Theoretical Guarantees: Curvature Flattening and Distance Convergence
The extended Ricci flow with loss coupling confers two main analytical guarantees under standard regularity conditions (volume, diameter, Sobolev/Poincaré bounds, Lipschitz control on loss gradients):
- Exponential decay of curvature: The Ricci energy decays exponentially provided the coupling coefficient $\lambda$ respects a sharp upper bound, with a Grönwall-type inequality ensuring $\kappa_{uv}(t) \to 0$ exponentially for all edges. This drives the embedding manifold toward local Euclidean flatness (Luo et al., 8 Dec 2025).
- Linear convergence of embedding distances: When the KGE loss $\mathcal{L}$ is $\mu$-strongly convex, the induced distance updates are equivalent to a perturbed gradient descent with an analytic contraction factor and summable perturbation, so the embedding distances converge at a linear rate to those of the unique global minimizer of the KGE objective (Luo et al., 8 Dec 2025).
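The perturbed-gradient-descent picture behind this guarantee can be illustrated numerically. The toy below (a strongly convex quadratic with a geometrically decaying, hence summable, perturbation standing in for the Ricci-flow term; all parameters are illustrative assumptions) exhibits the linear rate:

```python
import numpy as np

def perturbed_gd(x0, A, eta, perturb, n_steps):
    """Gradient descent on f(x) = 0.5 x^T A x (minimizer x* = 0) with an
    additive perturbation sequence. With A strongly convex and the
    perturbations summable, the error ||x_k - x*|| contracts linearly."""
    x = x0.copy()
    errs = [np.linalg.norm(x)]
    for k in range(n_steps):
        x = x - eta * (A @ x) + perturb(k)   # gradient step plus perturbation
        errs.append(np.linalg.norm(x))
    return errs

A = np.diag([1.0, 3.0])                          # strong convexity mu = 1, smoothness L = 3
perturb = lambda k: 0.01 * (0.5 ** k) * np.ones(2)   # geometrically decaying, summable
errs = perturbed_gd(np.array([1.0, 1.0]), A, eta=0.25, perturb=perturb, n_steps=60)
```

The per-step contraction factor here is $\max_i |1 - \eta \lambda_i| = 0.75$, and the summable perturbation does not spoil the geometric decay of the error.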
5. Algorithmic Implementation
A RicciKGE epoch proceeds by iterating over triples:
- Compute current distances $d_{ht}$, edge weights $w_{ht}$, and discrete Ricci curvatures $\kappa_{ht}$ (using the Wasserstein-1 distance between neighborhood measures).
- Evaluate loss gradients and derive optimal perturbations for entity embeddings.
- Aggregate updates across triples per entity; update all entity vectors.
- Recompute edge curvatures on the augmented graph.
- Optionally adapt $\lambda$ and learning rates based on validation metrics.
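The curvature step above can be made concrete. This sketch computes Ollivier-Ricci curvature on a small unweighted graph, solving the Wasserstein-1 transport problem between neighborhood measures as a linear program; the uniform, non-lazy neighbor measure is an assumption (the paper may use a lazy variant):

```python
import numpy as np
from scipy.optimize import linprog

def ollivier_ricci(adj, u, v):
    """Ollivier-Ricci curvature kappa(u, v) = 1 - W1(m_u, m_v) / d(u, v)
    on a small unweighted graph, with m_x the uniform measure on the
    neighbors of x and W1 computed by linear programming."""
    n = len(adj)
    # All-pairs shortest paths via Floyd-Warshall.
    D = np.where(adj > 0, 1.0, np.inf)
    np.fill_diagonal(D, 0.0)
    for k in range(n):
        D = np.minimum(D, D[:, [k]] + D[[k], :])
    Nu, Nv = np.flatnonzero(adj[u]), np.flatnonzero(adj[v])
    mu = np.full(len(Nu), 1.0 / len(Nu))
    mv = np.full(len(Nv), 1.0 / len(Nv))
    # Transport LP: minimize sum_ij pi_ij * D[Nu_i, Nv_j] over couplings pi.
    cost = D[np.ix_(Nu, Nv)].ravel()
    A_eq, b_eq = [], []
    for i in range(len(Nu)):              # row marginals equal m_u
        row = np.zeros((len(Nu), len(Nv))); row[i, :] = 1
        A_eq.append(row.ravel()); b_eq.append(mu[i])
    for j in range(len(Nv)):              # column marginals equal m_v
        col = np.zeros((len(Nu), len(Nv))); col[:, j] = 1
        A_eq.append(col.ravel()); b_eq.append(mv[j])
    res = linprog(cost, A_eq=np.array(A_eq), b_eq=np.array(b_eq), bounds=(0, None))
    return 1.0 - res.fun / D[u, v]
```

On a triangle every edge gets curvature $1/2$, while the middle edge of a four-node path gets curvature $0$, matching the intuition that clustered regions are positively curved and chain-like regions are flat-to-negative.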
Typical hyperparameter settings: the coupling coefficient $\lambda$ is chosen below the theory-derived upper bound, the KGE learning rate is tuned per base model, 256–1024 negative samples are drawn per positive, and early stopping uses a patience of 10 epochs (Luo et al., 8 Dec 2025).
Representative Algorithmic Steps
| Step | Description | Mathematical Object |
|---|---|---|
| Distance | Compute $d(h, t)$ between endpoint embeddings | Riemannian metric |
| Curvature | Compute $\kappa_{uv}$ via Wasserstein-1 | Ollivier-Ricci curvature |
| Update | Perturb $\mathbf{e}_h$, $\mathbf{e}_t$ minimally to realize $\Delta d_{ht}$ | Loss gradient, Lagrange multipliers |
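Putting the three table rows together, one simplified epoch pass might look like the following toy Euclidean sketch, with curvatures precomputed and the loss-gradient term stubbed out (both simplifying assumptions, not the paper's full procedure):

```python
import numpy as np

def epoch(E, triples, kappa, eta=0.05, lam=0.1):
    """One simplified RicciKGE-style pass: for each triple, read off the
    embedding distance, take one coupled Ricci-flow step on the edge
    weight, then realize the induced distance change by the symmetric
    minimal perturbation of the endpoint embeddings.

    E       : (n, dim) float entity embedding matrix, updated in place
    triples : iterable of (h, t) index pairs (relations omitted)
    kappa   : dict (h, t) -> precomputed discrete Ricci curvature
    """
    for h, t in triples:
        diff = E[h] - E[t]
        d = np.linalg.norm(diff)
        loss_grad_term = 0.0              # stub for the product of endpoint loss gradients
        w_new = d - eta * (kappa[(h, t)] * d + lam * loss_grad_term)
        delta_d = w_new - d               # targeted distance change from the flow step
        u = diff / d
        E[h] += 0.5 * delta_d * u         # minimal-norm symmetric split of the change
        E[t] -= 0.5 * delta_d * u
    return E
```

A positively curved edge thus pulls its endpoints together, with each subsequent epoch recomputing curvatures on the updated geometry.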
6. Empirical Performance and Parameter Sensitivity
RicciKGE consistently improves link prediction (e.g., TransE on WN18RR: MRR 0.700 → 0.705; DistMult on FB15K-237: 0.285 → 0.289) and node classification (e.g., PubMed: accuracy 89.97% → 90.57%) when injected as a regularizer into multiple base KGE models and GNNs. Curvature variance empirically decays faster than total loss, generally stabilizing within 50–75 epochs. The coupling coefficient $\lambda$ is highly sensitive: too small underutilizes curvature information, while too large degrades convergence and accuracy; the optimal $\lambda$ correlates with the theory-derived bounds. Ablation studies confirm both the utility and necessity of curvature-aware updates for expressive representation of local geometry (Luo et al., 8 Dec 2025).
7. Context, Significance, and Connections
RicciKGE provides a synthetic analytic mechanism for embedding adaptation to local heterogeneity, drawing foundational inspiration from Ricci flow in Riemannian geometry. Unlike approaches predicated on synthetic Ricci curvature bounds in metric measure spaces (Han, 2017), RicciKGE leverages a tight feedback loop between spatial regularization and relational loss optimization, and directly conditions metric evolution on predictive utility. A plausible implication is that data-driven extended Ricci flows may generalize to other relational learning domains displaying variable local curvature, such as dynamic graphs or sequence modeling evolutions. Theoretical analysis unambiguously establishes the dual flattening/convergence property: (i) exponential reduction of local curvature, (ii) strict linear contraction of embedding errors, ensuring robust global optima and geometric regularity (Luo et al., 8 Dec 2025).