
Shallow Domain Adaptive Embeddings for Sentiment Analysis (1908.06082v1)

Published 16 Aug 2019 in cs.IR, cs.CL, and cs.LG

Abstract: This paper proposes a way to improve the performance of existing text classification algorithms in domains with strong language semantics. We propose a domain adaptation layer that learns weights to combine a generic and a domain-specific (DS) word embedding into a domain-adapted (DA) embedding. The DA word embeddings are then used as inputs to a generic encoder + classifier framework to perform a downstream task such as classification. This adaptation layer is particularly suited to datasets that are modest in size and are therefore not ideal candidates for (re)training a deep neural network architecture. Results on binary and multi-class classification tasks using popular encoder architectures, including current state-of-the-art methods (with and without the shallow adaptation layer), show the effectiveness of the proposed approach.
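To make the idea concrete, here is a minimal PyTorch sketch of a shallow adaptation layer of the kind the abstract describes: learned weights gate a generic embedding against a domain-specific one, and the resulting domain-adapted embedding feeds a generic encoder + classifier. The class name, the sigmoid-gated convex combination, and the LSTM encoder are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class DomainAdaptationLayer(nn.Module):
    """Learns per-dimension weights to combine a generic and a
    domain-specific (DS) embedding into a domain-adapted (DA) one.
    (Hypothetical sketch; the paper's layer may differ in form.)"""
    def __init__(self, dim: int):
        super().__init__()
        # Learnable gate, initialised so both sources mix equally.
        self.alpha = nn.Parameter(torch.zeros(dim))

    def forward(self, generic: torch.Tensor, ds: torch.Tensor) -> torch.Tensor:
        a = torch.sigmoid(self.alpha)        # keep weights in (0, 1)
        return a * generic + (1.0 - a) * ds  # domain-adapted embedding

# Usage: DA embeddings feed a generic encoder + classifier.
dim = 300
adapt = DomainAdaptationLayer(dim)
encoder = nn.LSTM(dim, 128, batch_first=True)  # any generic encoder
classifier = nn.Linear(128, 2)                 # binary sentiment head

generic_emb = torch.randn(8, 20, dim)  # (batch, seq, dim), e.g. generic vectors
ds_emb = torch.randn(8, 20, dim)       # embeddings trained on in-domain text
da_emb = adapt(generic_emb, ds_emb)
_, (h, _) = encoder(da_emb)
logits = classifier(h[-1])             # (8, 2) class scores
```

Because only the adaptation layer (and optionally the classifier head) introduces new parameters, the setup stays trainable on the modest-sized datasets the abstract targets.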

Authors (3)
  1. Prathusha K Sarma (4 papers)
  2. Yingyu Liang (107 papers)
  3. William A Sethares (13 papers)
Citations (8)
