Graph Transformer Networks with Syntactic and Semantic Structures for Event Argument Extraction (2010.13391v1)

Published 26 Oct 2020 in cs.CL

Abstract: The goal of Event Argument Extraction (EAE) is to find the role of each entity mention for a given event trigger word. Previous work has shown that the syntactic structures of sentences are helpful for deep learning models for EAE. However, a major limitation of such prior work is that it fails to exploit the semantic structures of the sentences to induce effective representations for EAE. Consequently, in this work, we propose a novel model for EAE that exploits both syntactic and semantic structures of the sentences with Graph Transformer Networks (GTNs) to learn more effective sentence structures for EAE. In addition, we introduce a novel inductive bias based on the information bottleneck to improve the generalization of EAE models. Extensive experiments demonstrate the benefits of the proposed model, leading to state-of-the-art performance for EAE on standard datasets.
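
The sketch below illustrates the general idea of using a Graph Transformer Network to combine multiple input graphs (here, a syntactic dependency graph and a semantic graph of the sentence) into a learned structure for argument-role classification. It is not the authors' implementation: the module names, dimensions, and the simple trigger-entity scoring head are illustrative assumptions based on the abstract and the original GTN formulation (Yun et al., 2019); the paper's information-bottleneck regularizer is not sketched here.

```python
# Minimal sketch: GTN layers that softly select and compose syntactic and
# semantic adjacency matrices, followed by a toy EAE role classifier.
# Hypothetical names and shapes; not the paper's actual architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GTNLayer(nn.Module):
    """Softly selects and combines the input graphs into learned structures."""

    def __init__(self, num_graphs: int, num_channels: int):
        super().__init__()
        # One learnable score per (channel, input graph): a 1x1 "convolution"
        # over the stack of adjacency matrices.
        self.weights = nn.Parameter(torch.randn(num_channels, num_graphs))

    def forward(self, adjacencies: torch.Tensor) -> torch.Tensor:
        # adjacencies: (num_graphs, n, n), e.g. one syntactic dependency graph
        # and one semantic graph over the n words of the sentence.
        alpha = F.softmax(self.weights, dim=-1)                 # (channels, graphs)
        # Convex combination of the input graphs per output channel.
        return torch.einsum("cg,gnm->cnm", alpha, adjacencies)  # (channels, n, n)


class EAESketch(nn.Module):
    """Toy head: GTN-learned graphs feed one GCN step, then role scoring."""

    def __init__(self, hidden: int, num_graphs: int, channels: int, num_roles: int):
        super().__init__()
        self.gtn1 = GTNLayer(num_graphs, channels)
        self.gtn2 = GTNLayer(num_graphs, channels)
        self.gcn = nn.Linear(hidden, hidden)
        self.classifier = nn.Linear(2 * hidden, num_roles)

    def forward(self, h, adjacencies, trigger_idx, entity_idx):
        # h: (n, hidden) contextual word representations (e.g. from BERT).
        # Composing two GTN layers multiplies the selected graphs, which acts
        # as learning multi-hop "meta-path" structures over the sentence.
        a1 = self.gtn1(adjacencies)                        # (channels, n, n)
        a2 = self.gtn2(adjacencies)
        learned = torch.einsum("cnm,cmk->cnk", a1, a2)     # (channels, n, n)
        # Average the channels, row-normalize, and run one GCN step.
        adj = learned.mean(0)
        adj = adj / adj.sum(-1, keepdim=True).clamp(min=1e-6)
        h_graph = torch.relu(self.gcn(adj @ h))            # (n, hidden)
        # Score the role of the candidate entity with respect to the trigger.
        pair = torch.cat([h_graph[trigger_idx], h_graph[entity_idx]], dim=-1)
        return self.classifier(pair)                       # (num_roles,)
```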

Authors (3)
  1. Amir Pouran Ben Veyseh (20 papers)
  2. Tuan Ngo Nguyen (5 papers)
  3. Thien Huu Nguyen (61 papers)
Citations (46)
