Studying Attention Models in Sentiment Attitude Extraction Task (2006.11605v1)
Published 20 Jun 2020 in cs.CL
Abstract: In the sentiment attitude extraction task, the aim is to identify «attitudes» -- sentiment relations between entities mentioned in a text. In this paper, we present a study of attention-based context encoders for this task. We adapt attentive context encoders of two types: (i) feature-based and (ii) self-based. Our experiments with RuSentRel, a corpus of Russian analytical texts, show that models trained with attentive encoders outperform those trained without them, achieving a 1.5-5.9% increase in F1. We also analyze the distribution of attention weights depending on the term type.
- Nicolay Rusnachenko (8 papers)
- Natalia Loukachevitch (25 papers)
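
To make the idea of an attentive context encoder concrete, below is a minimal, hypothetical PyTorch sketch (not the authors' code) of a feature-based variant: attention weights over embedded context terms are scored against a "feature" vector, such as the embedding of an attitude participant, and the weighted sum forms the context representation. In a self-based variant, the scoring vector would instead be derived from the context itself. All module and variable names here are illustrative assumptions.

```python
# Hypothetical sketch of a feature-based attentive context encoder.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FeatureAttentionEncoder(nn.Module):
    def __init__(self, emb_dim: int, hidden_dim: int = 64):
        super().__init__()
        # Additive-style scoring between context terms and the feature vector.
        self.proj_ctx = nn.Linear(emb_dim, hidden_dim)
        self.proj_feat = nn.Linear(emb_dim, hidden_dim)
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, context: torch.Tensor, feature: torch.Tensor) -> torch.Tensor:
        # context: (batch, seq_len, emb_dim) -- embedded context terms
        # feature: (batch, emb_dim)          -- embedded attended feature (e.g., entity)
        h = torch.tanh(self.proj_ctx(context) + self.proj_feat(feature).unsqueeze(1))
        weights = F.softmax(self.score(h).squeeze(-1), dim=-1)  # (batch, seq_len)
        # Attention-weighted sum of context term embeddings -> context vector.
        return torch.bmm(weights.unsqueeze(1), context).squeeze(1)


# Usage: encode a batch of 2 contexts of 10 terms with 50-dim embeddings.
encoder = FeatureAttentionEncoder(emb_dim=50)
ctx = torch.randn(2, 10, 50)
entity = torch.randn(2, 50)
print(encoder(ctx, entity).shape)  # torch.Size([2, 50])
```

The learned weights over terms are also what enables the kind of analysis described in the abstract, i.e., inspecting how attention mass is distributed across different term types.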