ModulE: Module Embedding for Knowledge Graphs (2203.04702v1)

Published 9 Mar 2022 in cs.AI

Abstract: Knowledge graph embedding (KGE) has been shown to be a powerful tool for predicting missing links of a knowledge graph. However, existing methods mainly focus on modeling relation patterns, while simply embedding entities in vector spaces such as the real field, the complex field, and quaternion space. To model the embedding space from a more rigorous and theoretical perspective, we propose a novel general group theory-based embedding framework for rotation-based models, in which both entities and relations are embedded as group elements. Furthermore, in order to explore more available KGE models, we utilize a more generic algebraic structure, the module, a generalization of the notion of a vector space. Specifically, under our framework, we introduce a more generic embedding method, ModulE, which projects entities to a module. Following the method of ModulE, we build three instantiating models: ModulE$_{\mathbb{R},\mathbb{C}}$, ModulE$_{\mathbb{R},\mathbb{H}}$ and ModulE$_{\mathbb{H},\mathbb{H}}$, by adopting different module structures. Experimental results show that ModulE$_{\mathbb{H},\mathbb{H}}$, which embeds entities to a module over a non-commutative ring, achieves state-of-the-art performance on multiple benchmark datasets.
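To make the "rotation-based model" idea concrete, below is a minimal sketch of the kind of scoring function such frameworks generalize: a RotatE-style model in the complex field, where a relation acts on a head entity as an element-wise rotation and plausibility is the negative distance to the tail. This is purely illustrative; the paper's actual ModulE scoring functions (over modules, including modules over a non-commutative quaternion ring) are not reproduced here, and all names below are hypothetical.

```python
import numpy as np

def rotate_score(head, relation_phase, tail):
    """Score a triple (h, r, t) in a RotatE-style rotation model.

    The relation is represented by per-dimension phases; exponentiating
    gives unit-modulus complex numbers, i.e. element-wise rotations.
    The score is the negative L2 distance between the rotated head and
    the tail, so higher scores mean more plausible triples.
    """
    relation = np.exp(1j * relation_phase)  # unit complex numbers (rotations)
    return -np.linalg.norm(head * relation - tail)

# Toy demonstration with random embeddings (not trained parameters).
rng = np.random.default_rng(0)
d = 8
h = rng.normal(size=d) + 1j * rng.normal(size=d)
phase = rng.uniform(0.0, 2.0 * np.pi, size=d)
t_true = h * np.exp(1j * phase)                    # tail consistent with the rotation
t_rand = rng.normal(size=d) + 1j * rng.normal(size=d)  # unrelated tail

print(rotate_score(h, phase, t_true))   # near 0: a perfect match
print(rotate_score(h, phase, t_rand))   # strictly lower (more negative)
```

The group-theoretic view in the paper treats these unit complex numbers as a rotation group acting on the entity space; ModulE relaxes the entity space from a vector space to a module, which admits richer (e.g. quaternionic, non-commutative) scalar actions.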

Authors (2)
  1. Jingxuan Chai (3 papers)
  2. Guangming Shi (87 papers)
Citations (2)
