How to best use Syntax in Semantic Role Labelling (1906.00266v1)
Published 1 Jun 2019 in cs.CL
Abstract: There are many different ways in which external information might be used in an NLP task. This paper investigates how external syntactic information can be used most effectively in the Semantic Role Labeling (SRL) task. We evaluate three different ways of encoding syntactic parses and three different ways of injecting them into a state-of-the-art neural ELMo-based SRL sequence labelling model. We show that using a constituency representation as input features improves performance the most, achieving a new state-of-the-art for non-ensemble SRL models on the in-domain CoNLL'05 and CoNLL'12 benchmarks.
- Yufei Wang (141 papers)
- Mark Johnson (46 papers)
- Stephen Wan (10 papers)
- Yifang Sun (8 papers)
- Wei Wang (1793 papers)
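The abstract's best-performing setting injects a constituency representation as input features to the sequence labeller. A minimal sketch of that idea, assuming a BIO-style encoding of constituent spans concatenated onto token embeddings (function names, the span format, and the one-hot feature scheme are illustrative assumptions, not the paper's actual implementation):

```python
# Hypothetical sketch: turning a constituency parse into per-token input
# features for an SRL sequence labeller. Not the paper's exact method.

def constituency_features(tokens, spans):
    """Label each token with a BIO tag for its smallest covering constituent.

    `spans` is a list of (start, end, label) tuples, `end` exclusive.
    """
    feats = ["O"] * len(tokens)
    # Process wider spans first so narrower (inner) spans overwrite them.
    for start, end, label in sorted(spans, key=lambda s: s[0] - s[1]):
        for i in range(start, end):
            feats[i] = ("B-" if i == start else "I-") + label
    return feats

def inject_as_input(embeddings, feats, feat_vocab):
    """Concatenate a one-hot syntax feature onto each token embedding."""
    out = []
    for vec, f in zip(embeddings, feats):
        one_hot = [1.0 if f == v else 0.0 for v in feat_vocab]
        out.append(list(vec) + one_hot)
    return out

tokens = ["The", "cat", "sat"]
spans = [(0, 2, "NP"), (2, 3, "VP")]
feats = constituency_features(tokens, spans)
# feats == ["B-NP", "I-NP", "B-VP"]
```

In a real system the concatenated vectors would feed the ELMo-based tagger; this sketch only shows why an input-feature injection is straightforward to wire into any sequence model.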