
Position-based Contributive Embeddings for Aspect-Based Sentiment Analysis (2108.05098v2)

Published 11 Aug 2021 in cs.CL

Abstract: Aspect-based sentiment analysis (ABSA), which explores the sentiment polarity of a given aspect within a sentence, is a fine-grained task in the field of natural language processing. Previous research typically predicts polarity based on the meanings of the aspect and opinion words. However, those approaches mainly consider relations implicitly at the word level, ignoring the historical impact of words at other positions when the aspect appears at a certain position. Therefore, we propose Position-based Contributive Embeddings (PosCE) to highlight the historical reference to aspects at specific positions. The contribution of each positional word to the polarity resembles the process of fairly distributing gains among several actors working in coalition (game theory). We therefore adopt the Shapley value method to obtain PosCE, which enhances the aspect-based representation for the ABSA task. Furthermore, PosCE can also improve performance on the multimodal ABSA task. Extensive experiments at both the text and text-audio levels on SemEval datasets show that mainstream models improve accuracy and F1 (by 2.82% and 4.21% on average, respectively) when our PosCE is applied.
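The abstract frames each word position's contribution to polarity as a cooperative-game payoff and applies the Shapley value to distribute it fairly. The sketch below illustrates that idea only in outline: it computes exact Shapley values over a handful of word positions using a toy characteristic function (`polarity_gain`, with made-up per-position scores), not the paper's actual PosCE construction.

```python
from itertools import permutations

def shapley_values(positions, value_fn):
    """Exact Shapley values: each position's average marginal
    contribution over all orderings of the coalition (O(n!), so
    only feasible at toy scale)."""
    phi = {p: 0.0 for p in positions}
    orderings = list(permutations(positions))
    for order in orderings:
        coalition = []
        for p in order:
            before = value_fn(frozenset(coalition))
            coalition.append(p)
            after = value_fn(frozenset(coalition))
            phi[p] += after - before  # marginal contribution of p
    return {p: total / len(orderings) for p, total in phi.items()}

# Hypothetical characteristic function: polarity "gain" of a set of
# word positions (values invented for illustration).
scores = {0: 0.1, 1: 0.6, 2: 0.3}

def polarity_gain(coalition):
    return sum(scores[p] for p in coalition)

contrib = shapley_values([0, 1, 2], polarity_gain)
# Because this toy game is additive, each position's Shapley value
# equals its individual score; real ABSA interactions would not be.
```

In the paper's setting, the characteristic function would instead measure how a coalition of word positions affects the predicted polarity for an aspect at a given position, and the resulting values would form the position-based embeddings.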

Authors (4)
  1. Zijian Zhang (125 papers)
  2. Chenxin Zhang (1 paper)
  3. Jiangfeng Li (1 paper)
  4. Qinpei Zhao (2 papers)
