Investigating Typed Syntactic Dependencies for Targeted Sentiment Classification Using Graph Attention Neural Network (2002.09685v3)

Published 22 Feb 2020 in cs.CL

Abstract: Targeted sentiment classification predicts the sentiment polarity of given target mentions in input texts. Dominant methods employ neural networks to encode the input sentence and extract relations between target mentions and their contexts. Recently, graph neural networks have been investigated for integrating dependency syntax into the task, achieving state-of-the-art results. However, existing methods do not consider dependency label information, which is intuitively useful. To address this, we investigate a novel relational graph attention network that integrates typed syntactic dependency information. Results on standard benchmarks show that our method can effectively leverage label information to improve targeted sentiment classification performance. Our final model significantly outperforms state-of-the-art syntax-based approaches.
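
The abstract's core idea is to make graph attention sensitive to the dependency label (type) of each arc, not just the graph structure. Below is a minimal, hedged sketch of such a relational graph attention layer in PyTorch; it is an illustration of the general technique, not the authors' implementation. The class name `RelationalGATLayer`, the hyperparameters, and the use of a label embedding concatenated into the attention score are assumptions for clarity.

```python
# Minimal sketch of a label-aware (relational) graph attention layer.
# Assumptions: node features come from a sentence encoder (e.g. BiLSTM/BERT),
# `edge_index` holds head->dependent dependency arcs, and `edge_type` holds
# integer dependency-label ids. All names and sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationalGATLayer(nn.Module):
    def __init__(self, dim, num_labels, label_dim=32):
        super().__init__()
        self.w = nn.Linear(dim, dim, bias=False)               # node projection
        self.label_emb = nn.Embedding(num_labels, label_dim)   # typed-dependency embeddings
        self.attn = nn.Linear(2 * dim + label_dim, 1)          # scores (head, dependent, label)

    def forward(self, x, edge_index, edge_type):
        # x: (num_nodes, dim); edge_index: (2, num_edges); edge_type: (num_edges,)
        src, dst = edge_index                 # source (head) and target (dependent) nodes
        h = self.w(x)
        rel = self.label_emb(edge_type)       # one embedding per dependency label
        score = F.leaky_relu(
            self.attn(torch.cat([h[src], h[dst], rel], dim=-1))
        ).squeeze(-1)

        # Softmax over the incoming edges of each target node.
        score = score - score.max()
        alpha = torch.exp(score)
        denom = torch.zeros(x.size(0), device=x.device).index_add_(0, dst, alpha) + 1e-9
        alpha = alpha / denom[dst]

        # Aggregate label-aware messages from neighbours.
        out = torch.zeros_like(h).index_add_(0, dst, alpha.unsqueeze(-1) * h[src])
        return F.relu(out)

# Toy usage: 5 tokens, 4 dependency arcs with hypothetical label ids.
x = torch.randn(5, 64)
edge_index = torch.tensor([[1, 1, 3, 1], [0, 2, 1, 4]])   # head -> dependent
edge_type = torch.tensor([0, 1, 2, 3])                    # e.g. nsubj, dobj, ...
layer = RelationalGATLayer(dim=64, num_labels=40)
print(layer(x, edge_index, edge_type).shape)              # torch.Size([5, 64])
```

The key design point, consistent with the abstract, is that the attention score for each arc depends on the dependency label embedding, so different relation types can be weighted differently when propagating context toward the target mention.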

Authors (3)
  1. Xuefeng Bai (35 papers)
  2. Pengbo Liu (8 papers)
  3. Yue Zhang (620 papers)
Citations (3)

Summary

We haven't generated a summary for this paper yet.