
Towards Consistent Document-level Entity Linking: Joint Models for Entity Linking and Coreference Resolution (2108.13530v3)

Published 30 Aug 2021 in cs.CL

Abstract: We consider the task of document-level entity linking (EL), where it is important to make consistent decisions for entity mentions over the full document jointly. We aim to leverage explicit "connections" among mentions within the document itself: we propose to join the EL task with that of coreference resolution (coref). This is complementary to related works that exploit either (i) implicit document information (e.g., latent relations among entity mentions, or general language models) or (ii) connections between the candidate links (e.g., as inferred from the external knowledge base). Specifically, we cluster mentions that are linked via coreference, and enforce a single EL for all of the clustered mentions together. The latter constraint has the added benefit of increased coverage by joining EL candidate lists for the thus clustered mentions. We formulate the coref+EL problem as a structured prediction task over directed trees and use a globally normalized model to solve it. Experimental results on two datasets show a boost of up to +5% F1-score on both coref and EL tasks, compared to their standalone counterparts. For a subset of hard cases, with individual mentions lacking the correct EL in their candidate entity list, we obtain a +50% increase in accuracy.
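The cluster-level constraint described in the abstract can be illustrated with a minimal sketch. This is not the authors' globally normalized model over directed trees; it only shows the simpler idea that all coref-clustered mentions receive a single entity link chosen from their merged candidate lists, so a mention whose own list lacks the correct entity can still be linked via its cluster. The function name, mention ids, and scores below are hypothetical.

```python
from typing import Dict, List


def link_clusters(
    clusters: List[List[str]],        # coreference clusters of mention ids
    candidates: Dict[str, List[str]], # per-mention EL candidate lists
    score: Dict[str, float],          # hypothetical candidate scores
) -> Dict[str, str]:
    """Return one entity decision per mention, consistent within each cluster."""
    decisions: Dict[str, str] = {}
    for cluster in clusters:
        # Union of candidate lists: a mention lacking the gold entity can
        # still be linked correctly if another mention in its cluster has it.
        merged = {c for m in cluster for c in candidates.get(m, [])}
        if not merged:
            continue
        best = max(merged, key=lambda c: score.get(c, 0.0))
        for m in cluster:
            decisions[m] = best  # single EL enforced for the whole cluster
    return decisions


# Toy usage: "m2" has no correct candidate of its own but inherits the
# cluster-level decision.
clusters = [["m1", "m2"]]
candidates = {"m1": ["Q1", "Q2"], "m2": ["Q3"]}
score = {"Q1": 0.2, "Q2": 0.7, "Q3": 0.5}
print(link_clusters(clusters, candidates, score))  # {'m1': 'Q2', 'm2': 'Q2'}
```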

Authors (5)
  1. Klim Zaporojets (14 papers)
  2. Johannes Deleu (29 papers)
  3. Yiwei Jiang (11 papers)
  4. Thomas Demeester (76 papers)
  5. Chris Develder (59 papers)
Citations (10)
