Multilingual Generative Language Models for Zero-Shot Cross-Lingual Event Argument Extraction (2203.08308v1)
Abstract: We present a study on leveraging multilingual pre-trained generative language models for zero-shot cross-lingual event argument extraction (EAE). By formulating EAE as a language generation task, our method effectively encodes event structures and captures the dependencies between arguments. We design language-agnostic templates to represent the event argument structures; these templates are compatible with any language and thus facilitate cross-lingual transfer. Our proposed model finetunes multilingual pre-trained generative language models to generate sentences that fill the language-agnostic template with arguments extracted from the input passage. The model is trained on source languages and then applied directly to target languages for event argument extraction. Experiments demonstrate that the proposed model outperforms the current state-of-the-art models on zero-shot cross-lingual EAE. Comprehensive studies and error analyses are presented to better understand the advantages and the current limitations of using generative language models for zero-shot cross-lingual EAE.
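The core idea of a language-agnostic template can be sketched in plain Python: each event role gets a placeholder slot delimited by special tokens, the generative model is finetuned to copy arguments into those slots, and a simple parser recovers role-to-argument assignments from the generated sequence. The role names, the `[None]` filler for empty slots, and the helper functions below are illustrative assumptions, not the paper's exact implementation.

```python
import re

def make_template(roles):
    # Hypothetical language-agnostic template: one slot per event role,
    # delimited by special tokens shared across all languages so the
    # same format works for any source or target language.
    return " ".join(f"<{role}> [None] </{role}>" for role in roles)

def parse_output(generated, roles):
    # Recover role -> argument fillers from the model's generated
    # sequence; slots left as "[None]" mean no argument was extracted.
    arguments = {}
    for role in roles:
        match = re.search(rf"<{role}>\s*(.*?)\s*</{role}>", generated)
        if match and match.group(1) != "[None]":
            arguments[role] = match.group(1)
    return arguments

# Example roles for a hypothetical "Attack" event type.
roles = ["attacker", "target", "place"]
template = make_template(roles)

# A finetuned multilingual generator would emit a filled template; this
# string stands in for that model output.
generated = "<attacker> rebels </attacker> <target> convoy </target> <place> [None] </place>"
print(parse_output(generated, roles))  # → {'attacker': 'rebels', 'target': 'convoy'}
```

Because the template contains only special tokens and copied argument spans, no target-language supervision is needed at inference time: the parser works identically regardless of the language of the input passage.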
- Kuan-Hao Huang
- I-Hung Hsu
- Premkumar Natarajan
- Kai-Wei Chang
- Nanyun Peng