
Denoising Distantly Supervised Named Entity Recognition via a Hypergeometric Probabilistic Model (2106.09234v1)

Published 17 Jun 2021 in cs.CL

Abstract: Denoising is an essential step for distantly supervised named entity recognition. Previous denoising methods are mostly based on instance-level confidence statistics, which ignore the variety of the underlying noise distributions across datasets and entity types, making them difficult to adapt to high-noise-rate settings. In this paper, we propose Hypergeometric Learning (HGL), a denoising algorithm for distantly supervised NER that takes both the noise distribution and instance-level confidence into consideration. Specifically, during neural network training, we model the noisy samples in each batch as following a hypergeometric distribution parameterized by the noise rate. Each instance in the batch is then regarded as either correct or noisy according to its label confidence derived from the previous training step, together with the noise distribution in the sampled batch. Experiments show that HGL can effectively denoise the weakly labeled data retrieved via distant supervision, and therefore yields significant improvements in the trained models.
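The sketch below illustrates the batch-level idea described in the abstract: sampling a batch without replacement from a dataset with a known noise rate means the number of noisy instances per batch follows a hypergeometric distribution, and instances can then be flagged as noisy using their label confidences from the previous training step. This is only an illustrative reading of the abstract, not the paper's implementation; the function name `hgl_denoise_batch` and the rule of flagging the lowest-confidence instances are assumptions made for the example.

```python
import numpy as np


def hgl_denoise_batch(confidences, dataset_size, noise_rate, rng=None):
    """Hedged sketch of per-batch denoising in the spirit of HGL.

    confidences  : 1-D array of per-instance label confidences carried over
                   from the previous training step (higher = more trusted).
    dataset_size : total number of distantly labeled instances.
    noise_rate   : assumed fraction of noisy labels in the dataset.
    Returns a boolean mask, True for instances treated as correctly labeled.
    """
    rng = rng or np.random.default_rng()
    batch_size = len(confidences)

    # Population split implied by the assumed noise rate.
    n_noisy_total = int(round(noise_rate * dataset_size))
    n_clean_total = dataset_size - n_noisy_total

    # Sampling a batch without replacement: the count of noisy instances in
    # the batch follows a hypergeometric distribution.
    n_noisy_in_batch = rng.hypergeometric(n_noisy_total, n_clean_total, batch_size)

    # Illustrative choice: flag the lowest-confidence instances as noisy.
    keep = np.ones(batch_size, dtype=bool)
    if n_noisy_in_batch > 0:
        noisy_idx = np.argsort(confidences)[:n_noisy_in_batch]
        keep[noisy_idx] = False
    return keep


# Example usage: mask out suspected noisy instances before computing the loss.
conf = np.array([0.91, 0.12, 0.78, 0.05, 0.66])
mask = hgl_denoise_batch(conf, dataset_size=10_000, noise_rate=0.3)
print(mask)  # e.g. [ True False  True False  True ]
```

In a training loop, such a mask would typically be used to zero out or downweight the loss contribution of flagged instances for that step.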

Authors (7)
  1. Wenkai Zhang (15 papers)
  2. Hongyu Lin (94 papers)
  3. Xianpei Han (103 papers)
  4. Le Sun (111 papers)
  5. Huidan Liu (1 paper)
  6. Zhicheng Wei (5 papers)
  7. Nicholas Jing Yuan (22 papers)
Citations (12)
