
Cross-Domain Aspect Extraction using Transformers Augmented with Knowledge Graphs (2210.10144v1)

Published 18 Oct 2022 in cs.CL and cs.AI

Abstract: The extraction of aspect terms is a critical step in fine-grained sentiment analysis of text. Existing approaches for this task have yielded impressive results when the training and testing data are from the same domain. However, these methods show a drastic decrease in performance when applied to cross-domain settings where the domain of the testing data differs from that of the training data. To address this lack of extensibility and robustness, we propose a novel approach for automatically constructing domain-specific knowledge graphs that contain information relevant to the identification of aspect terms. We introduce a methodology for injecting information from these knowledge graphs into Transformer models, including two alternative mechanisms for knowledge insertion: via query enrichment and via manipulation of attention patterns. We demonstrate state-of-the-art performance on benchmark datasets for cross-domain aspect term extraction using our approach and investigate how the amount of external knowledge available to the Transformer impacts model performance.
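One of the two knowledge-insertion mechanisms described in the abstract, query enrichment, can be illustrated with a minimal sketch: domain-specific terms from a knowledge graph are injected into the model input so the Transformer can attend to likely aspect vocabulary. The toy graph, the `enrich_query` function, and the `[SEP]`-based formatting below are illustrative assumptions, not the authors' actual implementation.

```python
# Hedged sketch of the "query enrichment" idea: prepend domain
# knowledge-graph terms to the input before it reaches the encoder.
# The graph contents and function name are hypothetical.

# Toy domain knowledge graph: maps a domain to candidate aspect terms.
DOMAIN_KG = {
    "restaurants": ["food", "service", "ambience", "menu"],
    "laptops": ["battery", "screen", "keyboard", "price"],
}

def enrich_query(sentence: str, domain: str, max_terms: int = 4) -> str:
    """Prepend up to max_terms knowledge-graph terms, separated from
    the original sentence by a [SEP]-style marker."""
    terms = DOMAIN_KG.get(domain, [])[:max_terms]
    if not terms:
        return sentence
    return " ".join(terms) + " [SEP] " + sentence

print(enrich_query("The battery dies too fast.", "laptops"))
# → battery screen keyboard price [SEP] The battery dies too fast.
```

The enriched string would then be tokenized and fed to the Transformer in place of the raw sentence; the alternative mechanism in the paper instead biases attention patterns toward the injected knowledge.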

Authors (8)
  1. Phillip Howard (28 papers)
  2. Arden Ma (2 papers)
  3. Vasudev Lal (44 papers)
  4. Ana Paula Simoes (1 paper)
  5. Daniel Korat (9 papers)
  6. Oren Pereg (11 papers)
  7. Moshe Wasserblat (22 papers)
  8. Gadi Singer (4 papers)
Citations (9)
