
Within-Document Event Coreference with BERT-Based Contextualized Representations (2102.09600v2)

Published 15 Feb 2021 in cs.CL, cs.AI, and cs.IR

Abstract: Event coreference continues to be a challenging problem in information extraction. In the absence of any external knowledge bases for events, coreference becomes a clustering task that relies on effective representations of the context in which event mentions appear. Recent advances in contextualized language representations have proven successful in many tasks; however, their use in event linking has been limited. Here we present a three-part approach that (1) uses representations derived from a pretrained BERT model to (2) train a neural classifier to (3) drive a simple clustering algorithm to create coreference chains. We achieve state-of-the-art results with this model on two standard datasets for the within-document event coreference task and establish a new standard on a third, newer dataset.
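The pipeline the abstract describes (contextualized mention representations, a pairwise scorer, and a simple clustering step that builds coreference chains) can be sketched as follows. This is a minimal illustration, not the authors' implementation: cosine similarity over toy vectors stands in for the trained BERT-based pairwise classifier, and the greedy chain-linking routine is one simple clustering choice among several.

```python
import numpy as np

def cluster_mentions(embeddings, score_pair, threshold=0.5):
    """Greedily link each mention to the best-scoring earlier chain,
    or start a new chain if no score exceeds the threshold.

    In the paper's setting, `score_pair` would be a neural classifier
    over BERT-derived mention representations; here it is any function
    of two vectors returning a coreference score."""
    chains = []  # each chain is a list of mention indices
    for i, emb in enumerate(embeddings):
        best_chain, best_score = None, threshold
        for chain in chains:
            # score the new mention against every mention in the chain
            s = max(score_pair(emb, embeddings[j]) for j in chain)
            if s > best_score:
                best_chain, best_score = chain, s
        if best_chain is not None:
            best_chain.append(i)
        else:
            chains.append([i])
    return chains

def cosine(a, b):
    # stand-in for a trained pairwise coreference classifier
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# toy "contextualized" vectors: mentions 0 and 1 point the same way
vecs = [np.array([1.0, 0.0]), np.array([0.9, 0.1]), np.array([0.0, 1.0])]
print(cluster_mentions(vecs, cosine))  # → [[0, 1], [2]]
```

With real BERT embeddings, `score_pair` would be replaced by the trained classifier's probability output, and the threshold tuned on development data.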

Authors (2)
  1. Shafiuddin Rehan Ahmed (10 papers)
  2. James H. Martin (13 papers)
