
Incorporating Dynamic Semantics into Pre-Trained Language Model for Aspect-based Sentiment Analysis (2203.16369v2)

Published 30 Mar 2022 in cs.CL and cs.AI

Abstract: Aspect-based sentiment analysis (ABSA) predicts sentiment polarity towards a specific aspect in a given sentence. While pre-trained language models such as BERT have achieved great success, incorporating dynamic semantic changes into ABSA remains challenging. To this end, we propose Dynamic Re-weighting BERT (DR-BERT), a novel method designed to learn dynamic aspect-oriented semantics for ABSA. Specifically, we first take the Stack-BERT layers as a primary encoder to grasp the overall semantics of the sentence, and then fine-tune it by incorporating a lightweight Dynamic Re-weighting Adapter (DRA). Notably, the DRA attends to a small region of the sentence at each step and re-weights the most important words for better aspect-aware sentiment understanding. Finally, experimental results on three benchmark datasets demonstrate the effectiveness and rationality of the proposed model and provide interpretable insights for future semantic modeling.
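The abstract does not spell out how the DRA re-weights words, so the following is only an illustrative sketch of aspect-conditioned token re-weighting in general: token embeddings are scored against an aspect embedding and scaled by a softmax over those scores. The function names, the scaled dot-product scoring, and the use of NumPy are all assumptions for illustration, not the paper's actual adapter.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def reweight_tokens(H, aspect):
    """Hypothetical aspect-aware re-weighting (not the paper's DRA).

    H:      (n_tokens, d) contextual token embeddings from an encoder
    aspect: (d,) embedding of the aspect term
    Returns the re-weighted embeddings and the per-token weights.
    """
    d = H.shape[1]
    scores = H @ aspect / np.sqrt(d)   # aspect-token affinity (scaled dot product)
    weights = softmax(scores)          # emphasis placed on each token
    return weights[:, None] * H, weights

# Toy usage with random embeddings
rng = np.random.default_rng(0)
H = rng.normal(size=(6, 8))
aspect = rng.normal(size=8)
H_rw, w = reweight_tokens(H, aspect)
```

Tokens most aligned with the aspect receive the largest weights, so the downstream sentiment classifier sees a representation emphasizing aspect-relevant words.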

Authors (7)
  1. Kai Zhang (542 papers)
  2. Kun Zhang (353 papers)
  3. Mengdi Zhang (37 papers)
  4. Hongke Zhao (24 papers)
  5. Qi Liu (485 papers)
  6. Wei Wu (482 papers)
  7. Enhong Chen (242 papers)
Citations (48)