
Domain Representation for Knowledge Graph Embedding (1903.10716v4)

Published 26 Mar 2019 in cs.AI and cs.CL

Abstract: Embedding entities and relations into a continuous multi-dimensional vector space has become the dominant method for knowledge graph embedding in representation learning. However, most existing models fail to represent hierarchical knowledge, such as the similarities and dissimilarities of entities within one domain. We propose to learn domain representations on top of existing knowledge graph embedding models, such that entities with similar attributes are organized into the same domain. Such hierarchical domain knowledge can provide further evidence for link prediction. Experimental results show that domain embeddings yield a significant improvement over recent state-of-the-art baseline knowledge graph embedding models.
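The abstract does not give the paper's formulation, but the idea of scoring a triple with both entity-level and domain-level embeddings can be sketched as follows. This is a hedged illustration only: the TransE-style base score, the additive domain term, the `alpha` weight, and all entity/domain names are assumptions, not the authors' actual method.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 16

# Hypothetical embeddings: entity, relation, and domain vectors.
# Entities sharing attributes map to the same domain (e.g. cities).
entity_emb = {e: rng.normal(size=DIM)
              for e in ["paris", "france", "tokyo", "japan"]}
relation_emb = {"capital_of": rng.normal(size=DIM)}
domain_emb = {"city": rng.normal(size=DIM), "country": rng.normal(size=DIM)}
entity_domain = {"paris": "city", "tokyo": "city",
                 "france": "country", "japan": "country"}

def score(head, rel, tail, alpha=0.5):
    """TransE-style distance plus a domain-compatibility term (illustrative).

    Lower scores mean the triple (head, rel, tail) is more plausible.
    """
    base = np.linalg.norm(
        entity_emb[head] + relation_emb[rel] - entity_emb[tail])
    # Domain term: translate the head's domain vector by the relation and
    # compare to the tail's domain vector, weighted by alpha.
    dom = np.linalg.norm(
        domain_emb[entity_domain[head]] + relation_emb[rel]
        - domain_emb[entity_domain[tail]])
    return base + alpha * dom

# Link prediction: rank candidate tails for (paris, capital_of, ?).
candidates = ["france", "japan", "tokyo"]
ranked = sorted(candidates, key=lambda t: score("paris", "capital_of", t))
print(ranked)
```

The domain term lets two candidate tails with identical entity-level plausibility be separated by whether their domain is compatible with the relation, which is the kind of extra evidence for link prediction the abstract describes.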

Authors (6)
  1. Cunxiang Wang (31 papers)
  2. Feiliang Ren (18 papers)
  3. Zhichao Lin (8 papers)
  4. Chenxv Zhao (1 paper)
  5. Tian Xie (77 papers)
  6. Yue Zhang (620 papers)
Citations (1)
