Revisiting Joint Modeling of Cross-document Entity and Event Coreference Resolution (1906.01753v1)

Published 4 Jun 2019 in cs.CL

Abstract: Recognizing coreferring events and entities across multiple texts is crucial for many NLP applications. Despite the task's importance, research focus has been given mostly to within-document entity coreference, with rather little attention to the other variants. We propose a neural architecture for cross-document coreference resolution. Inspired by Lee et al. (2012), we jointly model entity and event coreference. We represent an event (entity) mention using its lexical span, surrounding context, and relation to entity (event) mentions via predicate-argument structures. Our model outperforms the previous state-of-the-art event coreference model on ECB+, while providing the first entity coreference results on this corpus. Our analysis confirms that all our representation elements, including the mention span itself, its context, and the relation to other mentions, contribute to the model's success.
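The abstract describes representing each mention as a combination of its lexical span, surrounding context, and related mentions linked through predicate-argument structures. A minimal sketch of that idea, using toy stand-in embeddings (the function names, dimensions, and averaging scheme here are illustrative assumptions, not the paper's actual architecture):

```python
import numpy as np

DIM = 4  # toy embedding size (assumption, not from the paper)

def embed(text: str, dim: int = DIM) -> np.ndarray:
    """Deterministic toy embedding; a stand-in for learned span/context encoders."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(dim)

def mention_vector(span: str, context: str, related_mentions: list) -> np.ndarray:
    """Concatenate span, context, and (averaged) related-mention embeddings,
    mirroring the three representation elements named in the abstract."""
    span_vec = embed(span)
    ctx_vec = embed(context)
    if related_mentions:
        rel_vec = np.mean([embed(m) for m in related_mentions], axis=0)
    else:
        rel_vec = np.zeros(DIM)
    return np.concatenate([span_vec, ctx_vec, rel_vec])

# An event mention ("hit") represented jointly with its entity argument ("Peru").
v = mention_vector("hit", "A strong earthquake hit Peru on Tuesday", ["Peru"])
print(v.shape)  # (12,)
```

The concatenation makes the joint-modeling intuition concrete: an event mention's vector changes when the representation of its entity arguments changes, and vice versa, which is what allows entity and event coreference decisions to inform each other.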

Authors (6)
  1. Shany Barhom (1 paper)
  2. Vered Shwartz (49 papers)
  3. Alon Eirew (11 papers)
  4. Michael Bugert (2 papers)
  5. Nils Reimers (25 papers)
  6. Ido Dagan (72 papers)
Citations (91)
