
Global Relation Embedding for Relation Extraction (1704.05958v2)

Published 19 Apr 2017 in cs.CL

Abstract: We study the problem of textual relation embedding with distant supervision. To combat the wrong labeling problem of distant supervision, we propose to embed textual relations with global statistics of relations, i.e., the co-occurrence statistics of textual and knowledge base relations collected from the entire corpus. This approach turns out to be more robust to the training noise introduced by distant supervision. On a popular relation extraction dataset, we show that the learned textual relation embedding can be used to augment existing relation extraction models and significantly improve their performance. Most remarkably, for the top 1,000 relational facts discovered by the best existing model, the precision can be improved from 83.9% to 89.3%.
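
To make the core idea concrete, here is a minimal sketch (not the authors' code) of the global statistics the abstract describes: counting, over the whole corpus, how often each textual relation (e.g., a dependency path between an entity pair) co-occurs with each knowledge base relation under distant supervision, then row-normalizing the counts into a distribution p(KB relation | textual relation). These corpus-level distributions, rather than individual noisy sentence labels, serve as the embedding training targets. The corpus format and the relation strings below are hypothetical placeholders.

```python
from collections import Counter, defaultdict

def build_global_statistics(aligned_corpus):
    """aligned_corpus: iterable of (textual_relation, kb_relation) pairs
    produced by distant supervision (entity pairs linked to KB facts)."""
    counts = defaultdict(Counter)
    for textual_rel, kb_rel in aligned_corpus:
        counts[textual_rel][kb_rel] += 1

    # Row-normalize so each textual relation maps to a probability
    # distribution over KB relations; aggregating over the entire corpus
    # is what dampens the per-sentence wrong-labeling noise.
    stats = {}
    for textual_rel, kb_counts in counts.items():
        total = sum(kb_counts.values())
        stats[textual_rel] = {kb: c / total for kb, c in kb_counts.items()}
    return stats

# Toy usage: hypothetical dependency-path relations aligned to KB relations.
corpus = [
    ("nsubj<-born->nmod:in", "place_of_birth"),
    ("nsubj<-born->nmod:in", "place_of_birth"),
    ("nsubj<-born->nmod:in", "place_lived"),   # distant-supervision noise
    ("appos->president->nmod:of", "employee_of"),
]
print(build_global_statistics(corpus))
```

In the paper's setup these distributions are then used to supervise a textual relation encoder, so the sketch covers only the statistics-collection step.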

Authors (6)
  1. Yu Su (138 papers)
  2. Honglei Liu (10 papers)
  3. Semih Yavuz (43 papers)
  4. Izzeddin Gur (23 papers)
  5. Huan Sun (88 papers)
  6. Xifeng Yan (52 papers)
Citations (28)
