
Capturing Event Argument Interaction via A Bi-Directional Entity-Level Recurrent Decoder (2107.00189v1)

Published 1 Jul 2021 in cs.CL

Abstract: Capturing interactions among event arguments is an essential step towards robust event argument extraction (EAE). However, existing efforts in this direction suffer from two limitations: 1) The argument role type information of contextual entities is mainly utilized as training signals, ignoring the potential merits of directly adopting it as semantically rich input features; 2) The argument-level sequential semantics, which implies the overall distribution pattern of argument roles over an event mention, is not well characterized. To tackle the above two bottlenecks, we formalize EAE as a Seq2Seq-like learning problem for the first time, where a sentence with a specific event trigger is mapped to a sequence of event argument roles. A neural architecture with a novel Bi-directional Entity-level Recurrent Decoder (BERD) is proposed to generate argument roles by incorporating contextual entities' argument role predictions, like a word-by-word text generation process, thereby distinguishing implicit argument distribution patterns within an event more accurately.
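To make the Seq2Seq-like formulation concrete, the sketch below shows one plausible shape of an entity-level recurrent decoder that predicts an argument role per candidate entity, with a forward and a backward pass that each feed the previous entity's predicted role back in as an input feature. The choice of GRU cells, the hidden size, greedy role feedback, and fusing the two directions by summing logits are illustrative assumptions, not the paper's exact BERD configuration.

```python
import torch
import torch.nn as nn

class BiDirectionalEntityDecoder(nn.Module):
    """Minimal sketch of a bi-directional entity-level recurrent decoder.

    Trigger-aware entity representations (e.g. from a sentence encoder) are
    decoded entity-by-entity into argument roles, like word-by-word text
    generation: a forward cell runs left-to-right, a backward cell
    right-to-left, and each step consumes the role predicted for the
    previous entity. Details here are assumptions for illustration.
    """

    def __init__(self, entity_dim: int, role_vocab_size: int, hidden_dim: int = 256):
        super().__init__()
        self.role_embedding = nn.Embedding(role_vocab_size, hidden_dim)
        # One recurrent cell per direction; input = entity feature + previous role embedding.
        self.fwd_cell = nn.GRUCell(entity_dim + hidden_dim, hidden_dim)
        self.bwd_cell = nn.GRUCell(entity_dim + hidden_dim, hidden_dim)
        self.fwd_out = nn.Linear(hidden_dim, role_vocab_size)
        self.bwd_out = nn.Linear(hidden_dim, role_vocab_size)

    def _decode(self, entities, cell, out_proj, reverse: bool):
        batch, num_entities, _ = entities.shape
        device = entities.device
        h = torch.zeros(batch, cell.hidden_size, device=device)
        prev_role = torch.zeros(batch, dtype=torch.long, device=device)  # index 0 = "no role"
        order = range(num_entities - 1, -1, -1) if reverse else range(num_entities)
        logits = [None] * num_entities
        for i in order:
            step_in = torch.cat([entities[:, i, :], self.role_embedding(prev_role)], dim=-1)
            h = cell(step_in, h)
            step_logits = out_proj(h)
            logits[i] = step_logits
            # Greedy feedback of the predicted role (teacher forcing could replace this in training).
            prev_role = step_logits.argmax(dim=-1)
        return torch.stack(logits, dim=1)  # (batch, num_entities, role_vocab_size)

    def forward(self, entities):
        """entities: (batch, num_entities, entity_dim) trigger-aware entity features."""
        fwd_logits = self._decode(entities, self.fwd_cell, self.fwd_out, reverse=False)
        bwd_logits = self._decode(entities, self.bwd_cell, self.bwd_out, reverse=True)
        return fwd_logits + bwd_logits  # fuse both directions (assumed: simple sum)


# Toy usage: 2 sentences, 5 candidate entities each, 768-dim features, 10 role types.
decoder = BiDirectionalEntityDecoder(entity_dim=768, role_vocab_size=10)
roles = decoder(torch.randn(2, 5, 768)).argmax(dim=-1)
print(roles.shape)  # torch.Size([2, 5])
```

Feeding predicted roles back into the decoder is what lets the model capture argument-level sequential semantics, i.e. the distribution pattern of roles across an event mention, rather than classifying each entity independently.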

Authors (6)
  1. Xiangyu Xi (8 papers)
  2. Wei Ye (110 papers)
  3. Shikun Zhang (82 papers)
  4. Quanxiu Wang (4 papers)
  5. Huixing Jiang (11 papers)
  6. Wei Wu (481 papers)
Citations (24)