Temporal Relation Extraction with a Graph-Based Deep Biaffine Attention Model (2201.06125v1)

Published 16 Jan 2022 in cs.CL and cs.LG

Abstract: Temporal information extraction plays a critical role in natural language understanding. Previous systems have incorporated advanced neural language models and have successfully enhanced the accuracy of temporal information extraction tasks. However, these systems have two major shortcomings. First, they fail to make use of the two-sided nature of temporal relations in prediction. Second, they involve non-parallelizable pipelines in the inference process that bring little performance gain. To this end, we propose a novel temporal information extraction model based on deep biaffine attention to extract temporal relationships between events in unstructured text efficiently and accurately. Our model is performant because we perform relation extraction tasks directly instead of treating event annotation as a prerequisite of relation extraction. Moreover, our architecture uses Multilayer Perceptrons (MLPs) with biaffine attention to predict arcs and relation labels separately, improving relation detection accuracy by exploiting the two-sided nature of temporal relationships. We experimentally demonstrate that our model achieves state-of-the-art performance in temporal relation extraction.
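
The abstract describes a Dozat-Manning-style biaffine scoring scheme: contextual token encodings pass through separate MLP heads, and two biaffine scorers predict arcs (whether a temporal relation holds between a pair of events) and relation labels independently, so both sides of a pair contribute to each score. Below is a minimal PyTorch sketch of that parameterization, assuming the standard biaffine formulation; the class names (Biaffine, BiaffineRelationExtractor) and dimensions are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn

class Biaffine(nn.Module):
    """Biaffine scorer: s(h, d) = [h; 1]^T U [d; 1], folding the
    bilinear, linear, and bias terms into a single tensor U."""
    def __init__(self, in_dim: int, out_dim: int = 1):
        super().__init__()
        self.U = nn.Parameter(torch.empty(out_dim, in_dim + 1, in_dim + 1))
        nn.init.xavier_uniform_(self.U)

    def forward(self, head: torch.Tensor, dep: torch.Tensor) -> torch.Tensor:
        # head, dep: (batch, seq_len, in_dim)
        ones = head.new_ones(*head.shape[:-1], 1)
        head = torch.cat([head, ones], dim=-1)  # (B, T, d+1)
        dep = torch.cat([dep, ones], dim=-1)    # (B, T, d+1)
        # (B, out_dim, T, T): one score per (head, dependent) pair
        return torch.einsum('bxi,oij,byj->boxy', head, self.U, dep)

class BiaffineRelationExtractor(nn.Module):
    """Separate MLP heads feed two biaffine scorers, one for arcs and
    one for relation labels, mirroring the split described above."""
    def __init__(self, hidden: int, mlp_dim: int, n_labels: int):
        super().__init__()
        self.arc_h = nn.Sequential(nn.Linear(hidden, mlp_dim), nn.ReLU())
        self.arc_d = nn.Sequential(nn.Linear(hidden, mlp_dim), nn.ReLU())
        self.lab_h = nn.Sequential(nn.Linear(hidden, mlp_dim), nn.ReLU())
        self.lab_d = nn.Sequential(nn.Linear(hidden, mlp_dim), nn.ReLU())
        self.arc_scorer = Biaffine(mlp_dim, out_dim=1)
        self.lab_scorer = Biaffine(mlp_dim, out_dim=n_labels)

    def forward(self, enc: torch.Tensor):
        # enc: contextual token encodings, e.g. from a pretrained encoder, (B, T, hidden)
        arcs = self.arc_scorer(self.arc_h(enc), self.arc_d(enc)).squeeze(1)  # (B, T, T)
        labels = self.lab_scorer(self.lab_h(enc), self.lab_d(enc))           # (B, L, T, T)
        return arcs, labels
```

Because both scorers cover the full T x T grid in one batched einsum, scoring all event pairs is a single parallel pass rather than a sequential pipeline, which is consistent with the efficiency motivation stated in the abstract.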

Authors (4)
  1. Bo-Ying Su
  2. Shang-Ling Hsu
  3. Kuan-Yin Lai
  4. Amarnath Gupta
Citations (3)
