Expanding Subjective Lexicons for Social Media Mining with Embedding Subspaces (1701.00145v2)

Published 31 Dec 2016 in cs.CL

Abstract: Recent approaches for sentiment lexicon induction have capitalized on pre-trained word embeddings that capture latent semantic properties. However, embeddings obtained by optimizing performance on a given task (e.g., predicting contextual words) are sub-optimal for other applications. In this paper, we address this problem by exploiting task-specific representations, induced via embedding subspace projection. This allows us to expand lexicons describing multiple semantic properties. For each property, our model jointly learns suitable representations and the concomitant predictor. Experiments conducted over multiple subjective lexicons show that our model outperforms previous work and other baselines, even in low training data regimes. Furthermore, lexicon-based sentiment classifiers built on top of our lexicons outperform similar resources and yield performances comparable to those of supervised models.
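
The abstract describes jointly learning a low-dimensional subspace projection of pre-trained embeddings together with a predictor for each semantic property. The paper's exact architecture is not reproduced here; the following is a minimal PyTorch sketch of that general idea, where the class name, `subspace_dim`, and the single-score predictor are illustrative assumptions rather than the authors' implementation.

```python
import torch
import torch.nn as nn

class SubspaceLexiconExpander(nn.Module):
    """Sketch: project frozen pre-trained embeddings into a
    task-specific subspace and predict one lexicon property,
    training the projection and predictor jointly."""

    def __init__(self, pretrained_embeddings, subspace_dim=50):
        super().__init__()
        vocab_size, embed_dim = pretrained_embeddings.shape
        # Frozen pre-trained embeddings (e.g., word2vec-style vectors).
        self.embed = nn.Embedding.from_pretrained(
            torch.as_tensor(pretrained_embeddings, dtype=torch.float32),
            freeze=True)
        # Learned linear projection into the embedding subspace.
        self.project = nn.Linear(embed_dim, subspace_dim, bias=False)
        # Predictor for one semantic property (e.g., a polarity score).
        self.predict = nn.Linear(subspace_dim, 1)

    def forward(self, word_ids):
        z = self.project(self.embed(word_ids))  # subspace representation
        return self.predict(z).squeeze(-1)      # per-word property score
```

Under this reading, one would fit the model on the seed words of an existing lexicon and then score the remaining vocabulary, adding high-confidence words to expand the lexicon; a separate projection and predictor would be learned per property.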

Authors (5)
  1. Silvio Amir (16 papers)
  2. Wang Ling (21 papers)
  3. Paula C. Carvalho (1 paper)
  4. Mário J. Silva (5 papers)
  5. Ramón Astudillo (1 paper)
Citations (3)
