
Contextualized Word Embeddings Enhanced Event Temporal Relation Extraction for Story Understanding (1904.11942v1)

Published 26 Apr 2019 in cs.CL

Abstract: Learning causal and temporal relationships between events is an important step towards deeper story and commonsense understanding. Though there are abundant datasets annotated with event relations for story comprehension, many have no empirical results associated with them. In this work, we establish strong baselines for event temporal relation extraction on two under-explored story narrative datasets: Richer Event Description (RED) and Causal and Temporal Relation Scheme (CaTeRS). To the best of our knowledge, these are the first results reported on these two datasets. We demonstrate that neural network-based models can outperform some strong traditional linguistic feature-based models. We also conduct comparative studies to show the contribution of adopting contextualized word embeddings (BERT) for event temporal relation extraction from stories. Detailed analyses are offered to better understand the results.
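To make the core idea concrete, here is a minimal sketch (not the authors' released code) of how contextualized word embeddings can be used for event temporal relation extraction: BERT encodes the sentence, the hidden states at the two event-trigger tokens are concatenated, and a small feed-forward head scores a temporal relation label. The label set, the classifier head, and the naive trigger-lookup strategy are illustrative assumptions; the head below is untrained, so its predictions are random until fitted to annotated pairs (e.g., from RED or CaTeRS).

```python
# Sketch: pairwise temporal relation classification over BERT embeddings.
# Assumptions (not from the paper): label set, head architecture, trigger lookup.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

LABELS = ["BEFORE", "AFTER", "OVERLAP", "NONE"]  # assumed label inventory

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

# Hypothetical pair classifier over the concatenated trigger embeddings.
classifier = nn.Sequential(
    nn.Linear(2 * encoder.config.hidden_size, 256),
    nn.ReLU(),
    nn.Linear(256, len(LABELS)),
)

def classify_pair(sentence: str, trigger1: str, trigger2: str) -> str:
    """Predict a temporal relation between two event triggers in a sentence."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        # Contextualized embeddings: one vector per wordpiece.
        hidden = encoder(**enc).last_hidden_state[0]  # (seq_len, hidden_size)
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0].tolist())
    # Naive lookup: first wordpiece exactly matching each trigger (assumption).
    i, j = tokens.index(trigger1), tokens.index(trigger2)
    pair = torch.cat([hidden[i], hidden[j]], dim=-1)
    logits = classifier(pair)
    return LABELS[logits.argmax().item()]

# Untrained head, so this output is arbitrary; shown only to exercise the API.
print(classify_pair("she tripped before she fell", "tripped", "fell"))
```

In a trained setup, `classifier` would be fit jointly with (or on top of frozen) BERT using cross-entropy over the dataset's gold relation labels; the paper's comparative studies contrast such contextualized-embedding models with traditional linguistic feature-based baselines.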

Authors (4)
  1. Rujun Han (19 papers)
  2. Mengyue Liang (1 paper)
  3. Bashar Alhafni (21 papers)
  4. Nanyun Peng (205 papers)
Citations (20)
