
Interpreting Knowledge Graph Relation Representation from Word Embeddings (1909.11611v2)

Published 25 Sep 2019 in cs.LG and stat.ML

Abstract: Many models learn representations of knowledge graph data by exploiting its low-rank latent structure, encoding known relations between entities and enabling unknown facts to be inferred. To predict whether a relation holds between entities, embeddings are typically compared in the latent space following a relation-specific mapping. Whilst their predictive performance has steadily improved, how such models capture the underlying latent structure of semantic information remains unexplained. Building on recent theoretical understanding of word embeddings, we categorise knowledge graph relations into three types and for each derive explicit requirements of their representations. We show that empirical properties of relation representations and the relative performance of leading knowledge graph representation methods are justified by our analysis.
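To make the abstract's setup concrete, the sketch below scores a candidate triple by applying a relation-specific mapping to the head-entity embedding and comparing the result to the tail-entity embedding in the latent space. The particular form used here (a learned linear map plus a translation, scored by negative Euclidean distance) is an illustrative assumption in the spirit of translational/multiplicative models, not the paper's specific formulation; all embeddings are random stand-ins for learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative choice)

# Hypothetical learned embeddings: in a real model these are trained
# on known (head, relation, tail) facts from the knowledge graph.
head = rng.normal(size=d)
tail = rng.normal(size=d)

# Relation-specific mapping: a linear map plus a translation vector.
# This is one assumed form; different models use different mappings.
W_r = rng.normal(size=(d, d))  # relation-specific linear map
b_r = rng.normal(size=d)       # relation-specific translation

def score(h: np.ndarray, t: np.ndarray) -> float:
    """Score a triple: map the head embedding via the relation,
    then measure closeness to the tail embedding. Higher (less
    negative) means the relation is predicted more likely to hold."""
    return -float(np.linalg.norm(W_r @ h + b_r - t))

print(score(head, tail))
```

Link prediction then amounts to ranking all candidate tail entities by this score for a given head and relation, so unknown facts are inferred from the learned latent structure.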

Authors (3)
  1. Carl Allen (16 papers)
  2. Ivana Balažević (15 papers)
  3. Timothy Hospedales (101 papers)
Citations (9)
