RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space (1902.10197v1)

Published 26 Feb 2019 in cs.LG, cs.CL, and stat.ML

Abstract: We study the problem of learning representations of entities and relations in knowledge graphs for predicting missing links. The success of such a task heavily relies on the ability of modeling and inferring the patterns of (or between) the relations. In this paper, we present a new approach for knowledge graph embedding called RotatE, which is able to model and infer various relation patterns including: symmetry/antisymmetry, inversion, and composition. Specifically, the RotatE model defines each relation as a rotation from the source entity to the target entity in the complex vector space. In addition, we propose a novel self-adversarial negative sampling technique for efficiently and effectively training the RotatE model. Experimental results on multiple benchmark knowledge graphs show that the proposed RotatE model is not only scalable, but also able to infer and model various relation patterns and significantly outperform existing state-of-the-art models for link prediction.

Authors (4)
  1. Zhiqing Sun (35 papers)
  2. Zhi-Hong Deng (39 papers)
  3. Jian-Yun Nie (70 papers)
  4. Jian Tang (327 papers)
Citations (1,918)

Summary

  • The paper introduces RotatE, which models KG relations as rotations in complex space to capture symmetry, inversion, and composition patterns.
  • It incorporates a self-adversarial negative sampling technique to optimize training and enhance link prediction performance.
  • Empirical results on FB15k, WN18, FB15k-237, and WN18RR demonstrate RotatE's superior performance compared to state-of-the-art models.

RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space

The paper "RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space," authored by Zhiqing Sun et al., addresses the challenge of representing entities and relations in knowledge graphs (KGs) to predict missing links. The crux of the problem is effectively modeling and inferring the relation patterns that are integral to link prediction. The authors introduce RotatE, a novel approach that defines each relation as a rotation in complex vector space, enabling the model to capture a variety of relation patterns, including symmetry/antisymmetry, inversion, and composition.

Key Contributions

  1. RotatE Model:
    • Entity and Relation Representation: Entities are embedded as complex vectors, and relations are modeled as rotations in the complex plane. For a triplet $(h, r, t)$, the model expects $t = h \circ r$, where $h, t, r \in \mathbb{C}^k$, $\circ$ denotes the element-wise (Hadamard) product, and each element of $r$ has unit modulus, $|r_i| = 1$.
    • Relation Patterns: This rotation-based mechanism allows RotatE to infer and model various relation patterns:
      • Symmetry/Antisymmetry: A relation is symmetric if and only if each element of its embedding is $\pm 1$, i.e., its phase is $0$ or $\pi$.
      • Inversion: Two relations are inverses of each other if their embeddings are complex conjugates, i.e., their phases are negated.
      • Composition: A relation composed of two others is modeled by the element-wise product of their embeddings, which adds their rotation phases.
  2. Self-Adversarial Negative Sampling:
    • The authors propose a novel self-adversarial sampling technique that dynamically generates negative samples based on the current embeddings, optimizing the negative sampling process and thus enhancing training efficiency and model performance.
  3. Empirical Evaluation:
    • Extensive experiments show that RotatE significantly outperforms state-of-the-art models on several benchmark datasets (FB15k, WN18, FB15k-237, and WN18RR). The results underscore RotatE’s capability to effectively model all tested relation patterns and its superior performance in various KG completion tasks.
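The rotation mechanics described above can be sketched in a few lines of NumPy. This is an illustrative toy (the embedding dimension, random initialization, and function names are assumptions, not the authors' code): each relation is a vector of phases acting as $e^{i\theta}$, and the assertions check the three relation patterns by construction.

```python
import numpy as np

def rotate_score(h, r_phase, t):
    """RotatE distance d(h, r, t) = ||h \circ r - t||, where the relation acts
    as an element-wise rotation e^{i * r_phase}; lower means more plausible."""
    r = np.exp(1j * r_phase)  # unit-modulus complex relation, |r_i| = 1
    return np.linalg.norm(h * r - t)

rng = np.random.default_rng(0)
k = 8  # toy embedding dimension
h = rng.normal(size=k) + 1j * rng.normal(size=k)

# Symmetry: phases of pi (or 0) make r its own inverse, so (h,r,t) and (t,r,h) both hold.
r_sym = np.full(k, np.pi)
t = h * np.exp(1j * r_sym)
assert np.isclose(rotate_score(t, r_sym, h), 0.0)

# Inversion: the conjugate relation (negated phases) undoes the rotation.
r1 = rng.uniform(-np.pi, np.pi, size=k)
t = h * np.exp(1j * r1)
assert np.isclose(rotate_score(t, -r1, h), 0.0)

# Composition: rotating by r1 then r2 equals rotating by r1 + r2.
r2 = rng.uniform(-np.pi, np.pi, size=k)
t = h * np.exp(1j * (r1 + r2))
assert np.isclose(rotate_score(h * np.exp(1j * r1), r2, t), 0.0)
```

In practice the embeddings are learned by gradient descent; the sketch only verifies that the rotation algebra supports each pattern.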
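The self-adversarial scheme weights each negative sample by a softmax (with temperature $\alpha$) over the current model's scores, so harder negatives contribute more to the loss, and the weights are treated as constants (no gradient flows through them). A minimal NumPy sketch of the weighting and the resulting margin loss, with illustrative function names and parameter values:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def self_adversarial_weights(neg_distances, alpha=1.0):
    """Softmax over negative-sample scores f = -d with temperature alpha:
    smaller distance (harder negative) -> larger weight. Weights sum to 1."""
    z = -alpha * neg_distances
    z = z - z.max()            # numerical stability
    w = np.exp(z)
    return w / w.sum()

def self_adv_loss(pos_distance, neg_distances, gamma=12.0, alpha=1.0):
    """Negative-sampling loss with self-adversarial weights:
    L = -log sigmoid(gamma - d_pos) - sum_j p_j * log sigmoid(d_neg_j - gamma)."""
    w = self_adversarial_weights(neg_distances, alpha)
    return (-np.log(sigmoid(gamma - pos_distance))
            - np.sum(w * np.log(sigmoid(neg_distances - gamma))))
```

Compared with uniform negative sampling, this focuses the gradient on negatives the model currently finds plausible, which the paper reports as both more efficient and more effective.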

Experimental Results

The empirical results provide compelling evidence of RotatE’s efficacy. On the FB15k and WN18 datasets, RotatE achieves mean reciprocal rank (MRR) scores of 0.797 and 0.949, respectively, surpassing previous state-of-the-art models like ComplEx and ConvE. Similarly, on datasets that demand more complex pattern inference such as FB15k-237 and WN18RR, RotatE secures MRR scores of 0.338 and 0.476, again outperforming current benchmarks.
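For reference, mean reciprocal rank (MRR) averages the inverse rank of the correct entity over all test triplets; a rank of 1 means the true entity scored best among all candidates. A minimal sketch:

```python
def mean_reciprocal_rank(ranks):
    """MRR over the (filtered) ranks of the true entity for each test triplet."""
    return sum(1.0 / r for r in ranks) / len(ranks)

# e.g., three test triplets whose true entities ranked 1st, 2nd, and 4th:
print(mean_reciprocal_rank([1, 2, 4]))  # (1 + 0.5 + 0.25) / 3
```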

Theoretical and Practical Implications

Theoretically, the deployment of rotations in complex space as a modeling technique for relations is both innovative and effective. The inherent structures of KGs often reflect complex interactions that are well-captured by the rotational symmetry in RotatE. In practice, the model’s scalability and robustness make it a potent tool for a wide array of applications including question-answering systems, information retrieval, and recommendation systems.

Future Directions

The flexibility and power of embedding-based methods like RotatE hold promise for further advancements in KG embeddings:

  1. Integration with Probabilistic Models: Extending RotatE to a probabilistic framework could address model uncertainties and provide more robust performance across diverse datasets.
  2. Real-world Applications: Applying RotatE to industrial-scale KGs such as Google’s Knowledge Graph would test the model’s scalability and integration capabilities.
  3. Enhanced Adversarial Training: Expanding the self-adversarial sampling technique could further push the boundaries of model optimization and efficiency.

In summary, RotatE introduces a significant advancement in the domain of knowledge graph embeddings by leveraging relational rotations in complex space. Its robust performance across various benchmarks, coupled with its elegant theoretical foundation, makes it a valuable contribution to the field of AI and machine learning.
