
Temporal Information Extraction for Question Answering Using Syntactic Dependencies in an LSTM-based Architecture (1703.05851v2)

Published 17 Mar 2017 in cs.IR and cs.CL

Abstract: In this paper, we propose a set of simple LSTM-based models with a uniform architecture to recover different kinds of temporal relations from text. Using the shortest dependency path between entities as input, the same architecture is used to extract intra-sentence, cross-sentence, and document creation time relations. A "double-checking" technique reverses entity pairs in classification, boosting the recall of positive cases and reducing misclassifications between opposite classes. An efficient pruning algorithm resolves conflicts globally. Evaluated on QA-TempEval (SemEval-2015 Task 5), our proposed technique outperforms state-of-the-art methods by a large margin.
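The "double-checking" idea can be sketched as follows: classify an entity pair in both orders and reconcile the two predictions, preferring a positive relation over NONE. The relation labels, inverse mapping, and reconciliation rule below are illustrative assumptions, not the paper's exact procedure:

```python
# Hypothetical sketch of the abstract's "double-checking" technique.
# The label set and tie-breaking policy are assumptions for illustration.

INVERSE = {"BEFORE": "AFTER", "AFTER": "BEFORE",
           "INCLUDES": "IS_INCLUDED", "IS_INCLUDED": "INCLUDES",
           "SIMULTANEOUS": "SIMULTANEOUS", "NONE": "NONE"}

def double_check(classify, e1, e2):
    """Classify (e1, e2) and (e2, e1); recover positives the
    forward pass missed, boosting recall of positive cases."""
    fwd = classify(e1, e2)            # label for the forward order
    rev = INVERSE[classify(e2, e1)]   # reversed label, mapped back
    if fwd == rev:
        return fwd                    # both orders agree
    if fwd == "NONE":
        return rev                    # recover a missed positive
    return fwd                        # otherwise trust the forward pass

# Toy classifier that only fires on the ("a", "b") order.
demo = lambda x, y: "BEFORE" if (x, y) == ("a", "b") else "NONE"
print(double_check(demo, "b", "a"))   # the reversed pass recovers AFTER
```

Running two passes per pair roughly doubles classification cost, but since the same model is reused with swapped inputs, no extra training is required under this reading.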

Authors (3)
  1. Yuanliang Meng
  2. Anna Rumshisky
  3. Alexey Romanov
Citations (50)
