
Evaluating the Impact of a Hierarchical Discourse Representation on Entity Coreference Resolution Performance (2104.10215v1)

Published 20 Apr 2021 in cs.CL, cs.AI, and cs.LG

Abstract: Recent work on entity coreference resolution (CR) follows current trends in deep learning, relying on embeddings and relatively simple task-related features. State-of-the-art models do not make use of hierarchical representations of discourse structure. In this work, we leverage automatically constructed discourse parse trees within a neural approach and demonstrate a significant improvement on two benchmark entity coreference resolution datasets. We explore how the impact varies depending upon the type of mention.
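The abstract describes incorporating automatically constructed discourse parse trees into a neural coreference model. The paper does not include its implementation here, but one common way to exploit such a hierarchy is to replace flat token distance between two mentions with their distance in the discourse tree (e.g., between the elementary discourse units containing them). The sketch below illustrates that idea only; the `Node`, `lca`, and `tree_distance` names are illustrative assumptions, not the authors' code.

```python
# Hypothetical sketch: deriving a hierarchical distance feature from a
# discourse parse tree, as an alternative to flat surface distance.
# Assumption: each mention has been mapped to a leaf node (an EDU) of the tree.

from dataclasses import dataclass, field
from typing import Optional, List


@dataclass
class Node:
    """A node in an (assumed) discourse parse tree."""
    label: str
    parent: Optional["Node"] = None
    children: List["Node"] = field(default_factory=list)

    def add(self, child: "Node") -> "Node":
        child.parent = self
        self.children.append(child)
        return child


def depth(node: Node) -> int:
    """Number of edges from the node up to the root."""
    d = 0
    while node.parent is not None:
        node = node.parent
        d += 1
    return d


def lca(a: Node, b: Node) -> Node:
    """Lowest common ancestor of two nodes."""
    ancestors = set()
    n: Optional[Node] = a
    while n is not None:
        ancestors.add(id(n))
        n = n.parent
    n = b
    while id(n) not in ancestors:
        n = n.parent  # guaranteed to reach the root, which is in `ancestors`
    return n


def tree_distance(a: Node, b: Node) -> int:
    """Edges on the path between two nodes via their lowest common ancestor.
    Mentions in the same discourse subtree score a smaller distance than
    mentions that are far apart in the hierarchy, even if their surface
    (token) distance is similar."""
    return depth(a) + depth(b) - 2 * depth(lca(a, b))


# Toy discourse tree: root -> (segment1 -> edu1, edu2), (segment2 -> edu3)
root = Node("root")
seg1 = root.add(Node("segment1"))
seg2 = root.add(Node("segment2"))
edu1 = seg1.add(Node("edu1"))
edu2 = seg1.add(Node("edu2"))
edu3 = seg2.add(Node("edu3"))

print(tree_distance(edu1, edu2))  # same segment -> 2
print(tree_distance(edu1, edu3))  # different segments -> 4
```

Such a distance (or related features, e.g., the label of the lowest common ancestor) could then be fed into a mention-pair scorer alongside the usual embedding features; the actual feature set used in the paper is not reproduced here.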

Authors (3)
  1. Sopan Khosla
  2. James Fiacco
  3. Carolyn Rose
Citations (3)
