SentiBERT: A Transferable Transformer-Based Architecture for Compositional Sentiment Semantics (2005.04114v4)

Published 8 May 2020 in cs.CL

Abstract: We propose SentiBERT, a variant of BERT that effectively captures compositional sentiment semantics. The model combines contextualized representations with a binary constituency parse tree to capture semantic composition. Comprehensive experiments demonstrate that SentiBERT achieves competitive performance on phrase-level sentiment classification. We further demonstrate that the sentiment composition learned from phrase-level annotations on SST can be transferred to other sentiment analysis tasks as well as related tasks, such as emotion classification. Moreover, we conduct ablation studies and design visualization methods to understand SentiBERT. We show that SentiBERT is better than baseline approaches at capturing negation and contrastive relations and at modeling compositional sentiment semantics.
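To make the idea of composition over a binary constituency parse tree concrete, here is a minimal illustrative sketch, not the paper's implementation: leaves stand in for contextualized (BERT-style) token vectors, and each internal node combines its two children with a shared learned transform. The tree structure, dimensions, and the specific composition function are all assumptions for illustration.

```python
import numpy as np

# Illustrative sketch (NOT SentiBERT's actual architecture): compose
# phrase representations bottom-up over a binary constituency parse tree.
# Leaves use random stand-ins for contextualized BERT token embeddings;
# each internal node combines its two children with a shared linear map
# followed by a tanh nonlinearity.

DIM = 8                                            # toy hidden size (assumption)
rng = np.random.default_rng(0)
W = rng.standard_normal((DIM, 2 * DIM)) / np.sqrt(2 * DIM)  # shared composition weights

def compose(node):
    """Return a vector for `node`: a token string (leaf) or a (left, right) pair."""
    if isinstance(node, str):                      # leaf: token representation
        return rng.standard_normal(DIM)            # stand-in for a BERT embedding
    left, right = node
    children = np.concatenate([compose(left), compose(right)])
    return np.tanh(W @ children)                   # phrase-level representation

# Binary parse of "not very good": (not (very good))
tree = ("not", ("very", "good"))
phrase_vec = compose(tree)
print(phrase_vec.shape)  # (8,)
```

Because the composition runs bottom-up, the representation of "very good" is built first and only then combined with "not", which is how a tree-structured model can, in principle, capture phenomena like negation scope.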

Authors (3)
  1. Da Yin (35 papers)
  2. Tao Meng (48 papers)
  3. Kai-Wei Chang (292 papers)
Citations (128)
