OneEE: A One-Stage Framework for Fast Overlapping and Nested Event Extraction (2209.02693v1)

Published 6 Sep 2022 in cs.CL

Abstract: Event extraction (EE) is an essential task of information extraction, which aims to extract structured event information from unstructured text. Most prior work focuses on extracting flat events while neglecting overlapped or nested ones. The few models for overlapped and nested EE include several successive stages to extract event triggers and arguments, which suffer from error propagation. Therefore, we design a simple yet effective tagging scheme and model to formulate EE as word-word relation recognition, called OneEE. The relations between trigger or argument words are simultaneously recognized in one stage with parallel grid tagging, thus yielding a very fast event extraction speed. The model is equipped with an adaptive event fusion module to generate event-aware representations and a distance-aware predictor to integrate relative distance information for word-word relation recognition, which are empirically demonstrated to be effective mechanisms. Experiments on 3 overlapped and nested EE benchmarks, namely FewFC, Genia11, and Genia13, show that OneEE achieves state-of-the-art (SOTA) results. Moreover, the inference speed of OneEE is faster than that of the baselines under the same conditions, and can be further improved substantially since it supports parallel inference.

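For intuition, below is a minimal, hypothetical sketch of the parallel grid tagging idea the abstract describes: every word pair (i, j) is scored for every word-word relation label in a single forward pass, so triggers and arguments are recognized jointly rather than in successive stages. The `GridTagger` class, the biaffine scorer, and the relation count are illustrative assumptions, not the paper's actual implementation, which additionally uses an adaptive event fusion module and a distance-aware predictor.

```python
# A minimal sketch of one-stage grid tagging for event extraction.
# NOT OneEE's exact architecture: the class name, the biaffine scorer,
# and num_relations are illustrative assumptions.
import torch
import torch.nn as nn

class GridTagger(nn.Module):
    """Scores every word pair (i, j) for a set of word-word relations
    (e.g., span links and trigger-argument roles) in one parallel pass."""

    def __init__(self, hidden: int = 768, num_relations: int = 4):
        super().__init__()
        self.head = nn.Linear(hidden, hidden)   # projection for word i
        self.tail = nn.Linear(hidden, hidden)   # projection for word j
        # Biaffine-style scorer: one bilinear map per relation label.
        self.scorer = nn.Bilinear(hidden, hidden, num_relations)

    def forward(self, word_reprs: torch.Tensor) -> torch.Tensor:
        # word_reprs: (batch, seq_len, hidden) from any encoder (e.g., BERT)
        b, n, h = word_reprs.shape
        hi = self.head(word_reprs)              # (b, n, h)
        tj = self.tail(word_reprs)              # (b, n, h)
        # Expand to all (i, j) pairs, then score each pair for each relation.
        hi = hi.unsqueeze(2).expand(b, n, n, h).reshape(-1, h)
        tj = tj.unsqueeze(1).expand(b, n, n, h).reshape(-1, h)
        logits = self.scorer(hi, tj)            # (b*n*n, num_relations)
        return logits.view(b, n, n, -1)         # the n x n tag grid

tagger = GridTagger()
grid = tagger(torch.randn(2, 10, 768))          # -> (2, 10, 10, 4)
```

Because the whole n x n grid is produced in one pass, decoding requires no stage-by-stage pipeline, which is what lets this family of models avoid error propagation and batch all pair scores in parallel.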
Authors (9)
  1. Hu Cao (17 papers)
  2. Jingye Li (15 papers)
  3. Fangfang Su (3 papers)
  4. Fei Li (233 papers)
  5. Hao Fei (105 papers)
  6. Shengqiong Wu (36 papers)
  7. Bobo Li (23 papers)
  8. Liang Zhao (353 papers)
  9. Donghong Ji (50 papers)
Citations (30)
