Revisiting the Negative Data of Distantly Supervised Relation Extraction (2105.10158v1)

Published 21 May 2021 in cs.CL and cs.AI

Abstract: Distant supervision automatically generates plenty of training samples for relation extraction. However, it also incurs two major problems: noisy labels and imbalanced training data. Previous works focus more on reducing wrongly labeled relations (false positives), while few explore the missing relations caused by the incompleteness of the knowledge base (false negatives). Furthermore, the quantity of negative labels overwhelmingly surpasses that of positive ones in previous problem formulations. In this paper, we first provide a thorough analysis of the above challenges caused by negative data. Second, we formulate relation extraction as a positive-unlabeled learning task to alleviate the false negative problem. Third, we propose a pipeline approach, dubbed ReRe, that performs sentence-level relation detection and then subject/object extraction to achieve sample-efficient training. Experimental results show that the proposed method consistently outperforms existing approaches and retains excellent performance even when learned with a large quantity of false positive samples.
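The positive-unlabeled (PU) formulation mentioned in the abstract treats unlabeled sentence pairs as a mixture of positives and negatives rather than as outright negatives. As a rough illustration only (not the paper's exact objective), a standard non-negative PU risk estimator in the style of Kiryo et al. (2017) can be sketched as below; the hinge loss and the class prior `prior` are illustrative assumptions:

```python
import numpy as np

def hinge(scores, label):
    """Hinge loss for labels in {+1, -1}."""
    return np.maximum(0.0, 1.0 - label * scores)

def nn_pu_risk(scores_pos, scores_unl, prior):
    """Non-negative PU risk estimator (Kiryo et al., 2017):
        pi * R_p(+1) + max(0, R_u(-1) - pi * R_p(-1))
    where pi (`prior`) is the assumed fraction of positives
    among the unlabeled data.
    """
    r_pos = prior * hinge(scores_pos, +1).mean()      # positive risk on labeled positives
    r_neg_unl = hinge(scores_unl, -1).mean()          # negative risk on unlabeled data
    r_neg_pos = prior * hinge(scores_pos, -1).mean()  # correction term from positives
    # Clamping at zero prevents the estimated negative risk
    # from going negative (the source of overfitting in naive PU).
    return r_pos + max(0.0, r_neg_unl - r_neg_pos)

# Example: classifier scores for labeled positives and unlabeled samples.
risk = nn_pu_risk(np.array([2.0, 0.5]), np.array([-1.0, 0.2]), prior=0.3)
print(risk)  # 0.075
```

In a relation-extraction setting, `scores_unl` would come from sentences whose entity pairs have no matching fact in the knowledge base, which under distant supervision would otherwise all be labeled negative.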

Authors (6)
  1. Chenhao Xie (10 papers)
  2. Jiaqing Liang (62 papers)
  3. Jingping Liu (18 papers)
  4. Chengsong Huang (11 papers)
  5. Wenhao Huang (99 papers)
  6. Yanghua Xiao (151 papers)
Citations (24)
