Exploiting Adaptive Contextual Masking for Aspect-Based Sentiment Analysis (2402.13722v1)

Published 21 Feb 2024 in cs.CL

Abstract: Aspect-Based Sentiment Analysis (ABSA) is a fine-grained linguistic problem that entails the extraction of multifaceted aspects, opinions, and sentiments from a given text. Both standalone and compound ABSA tasks have been extensively used in the literature to examine the nuanced information present in online reviews and social media posts. Current ABSA methods often rely on static hyperparameters for attention-masking mechanisms, which can struggle with context adaptation and may overlook the unique relevance of words in varied situations. This leads to challenges in accurately analyzing complex sentences containing multiple aspects with differing sentiments. In this work, we present adaptive masking methods that remove irrelevant tokens based on context to assist in the Aspect Term Extraction and Aspect Sentiment Classification subtasks of ABSA. Our experiments show that the proposed methods outperform baseline methods in accuracy and F1 score on four benchmark online review datasets. Further, we show that the proposed methods can be extended with multiple adaptations, and we present a qualitative analysis of the approach on sample text for aspect term extraction.
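
The core idea of context-adaptive masking can be illustrated with a short PyTorch sketch. This is not the paper's implementation: the module name AdaptiveContextMask, the learned per-sentence threshold head, and the straight-through masking trick are all assumptions made here for illustration of the general technique (dropping low-relevance context tokens with a threshold that depends on the sentence rather than a static hyperparameter).

# Illustrative sketch only; names and design details are assumptions, not the paper's code.
import torch
import torch.nn as nn

class AdaptiveContextMask(nn.Module):
    """Masks context tokens whose relevance to the aspect falls below a
    threshold predicted from the sentence itself, rather than a fixed value."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.query = nn.Linear(hidden_dim, hidden_dim)    # projects the aspect representation
        self.key = nn.Linear(hidden_dim, hidden_dim)      # projects the context tokens
        self.threshold_head = nn.Linear(hidden_dim, 1)    # predicts a per-sentence threshold

    def forward(self, tokens: torch.Tensor, aspect: torch.Tensor):
        # tokens: (batch, seq_len, hidden_dim); aspect: (batch, hidden_dim)
        q = self.query(aspect).unsqueeze(1)               # (batch, 1, hidden_dim)
        k = self.key(tokens)                              # (batch, seq_len, hidden_dim)
        relevance = torch.sigmoid((q * k).sum(-1) / k.size(-1) ** 0.5)   # (batch, seq_len)

        # Adaptive threshold: computed from the pooled sentence representation,
        # so the cut-off changes with context instead of being a static hyperparameter.
        tau = torch.sigmoid(self.threshold_head(tokens.mean(dim=1)))     # (batch, 1)

        keep = (relevance >= tau).float()                 # hard 0/1 mask of retained tokens
        # Straight-through style: gradients flow through the soft relevance scores.
        mask = keep + relevance - relevance.detach()
        return tokens * mask.unsqueeze(-1), keep

if __name__ == "__main__":
    layer = AdaptiveContextMask(hidden_dim=8)
    toks = torch.randn(2, 5, 8)    # two sentences, five context tokens each
    asp = torch.randn(2, 8)        # pooled aspect-term embeddings
    masked, kept = layer(toks, asp)
    print(masked.shape, kept)      # torch.Size([2, 5, 8]) and the 0/1 keep mask

The point of the sketch is the threshold: instead of a static masking hyperparameter, the cut-off is predicted from the sentence itself, so which tokens survive depends on the context surrounding each aspect.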

Authors (4)
  1. S M Rafiuddin (1 paper)
  2. Mohammed Rakib (7 papers)
  3. Sadia Kamal (7 papers)
  4. Arunkumar Bagavathi (18 papers)
Citations (2)