Evaluating the Impact of a Hierarchical Discourse Representation on Entity Coreference Resolution Performance (2104.10215v1)
Published 20 Apr 2021 in cs.CL, cs.AI, and cs.LG
Abstract: Recent work on entity coreference resolution (CR) follows current trends in Deep Learning applied to embeddings and relatively simple task-related features. SOTA models do not make use of hierarchical representations of discourse structure. In this work, we leverage automatically constructed discourse parse trees within a neural approach and demonstrate a significant improvement on two benchmark entity coreference-resolution datasets. We explore how the impact varies depending upon the type of mention.
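The abstract does not spell out how the discourse parse trees are consumed by the model, so the following is only a minimal, hypothetical sketch (not the authors' implementation) of one way hierarchical discourse structure could feed a pairwise coreference scorer: map each mention to the elementary discourse unit (EDU) that contains it and compute a tree distance between the two EDUs as an extra feature. The `Mention` class, `edu_id` field, and `parent` pointer representation are all assumptions introduced for illustration.

```python
# Hypothetical sketch (not the paper's code): deriving a discourse-tree
# distance feature for a mention pair, assuming mentions are aligned to
# EDU leaves and the discourse parse is given as parent pointers.

from dataclasses import dataclass


@dataclass
class Mention:
    text: str
    edu_id: int  # leaf node of the discourse parse containing this mention


def path_to_root(node: int, parent: dict[int, int]) -> list[int]:
    """Return the chain of nodes from `node` up to the root (inclusive)."""
    path = [node]
    while node in parent:
        node = parent[node]
        path.append(node)
    return path


def discourse_tree_distance(m1: Mention, m2: Mention, parent: dict[int, int]) -> int:
    """Number of edges between the EDUs of two mentions in the discourse tree.

    A small distance means the mentions sit in closely related discourse
    segments; a neural scorer could consume this value as a pair feature
    alongside span embeddings.
    """
    p1 = path_to_root(m1.edu_id, parent)
    p2 = path_to_root(m2.edu_id, parent)
    depth_in_p1 = {node: depth for depth, node in enumerate(p1)}
    for depth2, node in enumerate(p2):
        if node in depth_in_p1:          # lowest common ancestor found
            return depth_in_p1[node] + depth2
    return len(p1) + len(p2)             # fallback; unreachable in a proper tree


if __name__ == "__main__":
    # Toy discourse tree: EDUs 0-3 are leaves, nodes 4-5 are internal, 6 is the root.
    parent = {0: 4, 1: 4, 2: 5, 3: 5, 4: 6, 5: 6}
    antecedent = Mention("the proposal", edu_id=0)
    nearby_anaphor = Mention("it", edu_id=1)
    distant_anaphor = Mention("it", edu_id=3)
    print(discourse_tree_distance(antecedent, nearby_anaphor, parent))   # 2 (same subtree)
    print(discourse_tree_distance(antecedent, distant_anaphor, parent))  # 4 (across subtrees)
```

In this sketch the feature is a raw integer; in practice one would bucket or embed such distances before concatenating them with mention-pair representations, but the paper's actual feature design may differ.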
- Sopan Khosla
- James Fiacco
- Carolyn Rose