Learning Emoji Embeddings using Emoji Co-occurrence Network Graph (1806.07785v1)

Published 20 Jun 2018 in cs.SI

Abstract: Usage of emoji on social media platforms has increased rapidly over the last few years. A majority of social media posts are laden with emoji, and users often use more than one emoji in a single post to express their emotions and to emphasize certain words in a message. Utilizing emoji co-occurrence can help in understanding how emoji are used in social media posts and what they mean in that context. In this paper, we investigate whether emoji co-occurrences can be used as a feature to learn emoji embeddings, which in turn can be used in many downstream applications such as sentiment analysis and emotion identification in social media text. We utilize 147 million tweets that contain emoji to build an emoji co-occurrence network, and then train a network embedding model to embed emoji into a low-dimensional vector space. We evaluate our embeddings in sentiment analysis and emoji similarity experiments, and the results show that our embeddings outperform the current state of the art on sentiment analysis tasks.
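
The abstract describes a two-step pipeline: build a weighted emoji co-occurrence graph from tweets, then learn node embeddings over that graph. Below is a minimal sketch of such a pipeline, not the authors' actual code; the tweet iterator, the use of uniform random walks with skip-gram (a node2vec-style approach without p/q biasing), and the walk and embedding parameters are all assumptions for illustration.

```python
# Sketch: emoji co-occurrence graph + random-walk embeddings.
# Assumes an iterable of tweet strings and a set of emoji characters.
import itertools
import random
from collections import Counter

import networkx as nx
from gensim.models import Word2Vec


def extract_emojis(text, emoji_set):
    """Return the emoji characters found in a tweet."""
    return [ch for ch in text if ch in emoji_set]


def build_cooccurrence_graph(tweets, emoji_set):
    """Undirected graph; edge weight = number of tweets where both emoji appear."""
    counts = Counter()
    for tweet in tweets:
        emojis = sorted(set(extract_emojis(tweet, emoji_set)))
        for a, b in itertools.combinations(emojis, 2):
            counts[(a, b)] += 1
    g = nx.Graph()
    for (a, b), w in counts.items():
        g.add_edge(a, b, weight=w)
    return g


def random_walks(g, num_walks=10, walk_length=40):
    """Uniform random walks over the graph, used as 'sentences' for skip-gram."""
    walks = []
    for _ in range(num_walks):
        for node in g.nodes():
            walk = [node]
            while len(walk) < walk_length:
                nbrs = list(g.neighbors(walk[-1]))
                if not nbrs:
                    break
                walk.append(random.choice(nbrs))
            walks.append(walk)
    return walks


def train_embeddings(walks, dim=300):
    """Skip-gram over the walks yields one low-dimensional vector per emoji."""
    model = Word2Vec(walks, vector_size=dim, window=5, min_count=1, sg=1, workers=4)
    return model.wv
```

With embeddings in hand, emoji similarity can be probed via `wv.most_similar("😂")`, and the vectors can be fed as features into a downstream sentiment classifier, as the abstract's evaluation suggests.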

Authors (2)
  1. Anurag Illendula (6 papers)
  2. Manish Reddy Yedulla (2 papers)
Citations (11)