
Group Representation Theory for Knowledge Graph Embedding (1909.05100v2)

Published 11 Sep 2019 in cs.LG, cs.AI, and math.RT

Abstract: Knowledge graph embedding has recently become a popular way to model relations and infer missing links. In this paper, we present a group-theoretic perspective on knowledge graph embedding, connecting previous methods with different group actions. Furthermore, by utilizing Schur's lemma from group representation theory, we show that the state-of-the-art embedding method RotatE can model relations from any finite Abelian group.
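The Abelian-group connection in the abstract can be made concrete with RotatE's published formulation (Sun et al., 2019): each relation is a vector of unit-modulus complex numbers, a tail embedding is the head embedding rotated elementwise, and composing relations multiplies rotations, which commutes. The sketch below is an illustration of that idea, not code from the paper; the dimension and random embeddings are arbitrary choices.

```python
import numpy as np

d = 4
rng = np.random.default_rng(0)

# Head entity embedding in C^d
h = rng.standard_normal(d) + 1j * rng.standard_normal(d)

# A relation is a vector of phases, i.e. unit-modulus complex numbers
theta = rng.uniform(0, 2 * np.pi, d)
r = np.exp(1j * theta)

# RotatE models a true triple (h, r, t) as t = h ∘ r (elementwise rotation)
t = h * r

# The score is the distance between the rotated head and the tail;
# for an exact fact it is zero
score = np.linalg.norm(h * r - t)
assert np.isclose(score, 0.0)

# Rotations commute, reflecting the Abelian group structure:
# applying r then r2 equals applying r2 then r
r2 = np.exp(1j * rng.uniform(0, 2 * np.pi, d))
assert np.allclose((h * r) * r2, (h * r2) * r)
```

Since multiplication of unit complex numbers is commutative, the set of relation vectors forms an Abelian group under composition, which is the structure the paper's Schur's-lemma argument builds on.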

Citations (5)
