A Transformer-based joint-encoding for Emotion Recognition and Sentiment Analysis (2006.15955v1)

Published 29 Jun 2020 in cs.CL, cs.HC, and cs.LG

Abstract: Understanding expressed sentiment and recognizing emotions are two crucial tasks in human multimodal language. This paper describes a Transformer-based joint-encoding (TBJE) for Emotion Recognition and Sentiment Analysis. In addition to using the Transformer architecture, our approach relies on modular co-attention and a glimpse layer to jointly encode one or more modalities. The proposed solution was also submitted to the ACL 2020 Second Grand-Challenge on Multimodal Language, where it is evaluated on the CMU-MOSEI dataset. The code to replicate the presented experiments is open source: https://github.com/jbdel/MOSEI_UMONS.
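To make the architectural description concrete, below is a minimal PyTorch sketch of the two components the abstract names: a co-attention block that encodes one modality conditioned on another, and a glimpse layer that pools the encoded sequence into a fixed-size vector via learned attention distributions. This is an illustration of the general technique under assumed hyperparameters (d_model, n_heads, n_glimpses), not the authors' implementation; see the linked repository for the actual TBJE code.

```python
import torch
import torch.nn as nn

class CoAttentionBlock(nn.Module):
    """Sketch of one modular co-attention block: self-attention on the
    primary modality, then cross-attention conditioned on a second
    modality, followed by a position-wise feed-forward layer."""
    def __init__(self, d_model=512, n_heads=8):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, 4 * d_model), nn.ReLU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.norm3 = nn.LayerNorm(d_model)

    def forward(self, x, y):
        # x: primary modality, e.g. text (B, S_x, D)
        # y: conditioning modality, e.g. acoustic features (B, S_y, D)
        x = self.norm1(x + self.self_attn(x, x, x)[0])
        x = self.norm2(x + self.cross_attn(x, y, y)[0])
        return self.norm3(x + self.ffn(x))

class Glimpse(nn.Module):
    """Sketch of a glimpse layer: learns n_glimpses attention
    distributions over the sequence and returns the concatenation
    of the resulting weighted-sum vectors."""
    def __init__(self, d_model=512, n_glimpses=2):
        super().__init__()
        self.proj = nn.Linear(d_model, n_glimpses)

    def forward(self, x):
        # x: (B, S, D) -> pooled vector (B, n_glimpses * D)
        attn = torch.softmax(self.proj(x), dim=1)        # (B, S, G)
        pooled = torch.einsum('bsg,bsd->bgd', attn, x)   # (B, G, D)
        return pooled.flatten(1)

# Usage: jointly encode text conditioned on audio, then pool for a
# classification head (shapes are illustrative).
text = torch.randn(8, 50, 512)
audio = torch.randn(8, 120, 512)
encoded = CoAttentionBlock()(text, audio)   # (8, 50, 512)
fused = Glimpse()(encoded)                  # (8, 1024)
```

The glimpse pooling replaces a plain mean or [CLS]-style pooling: each glimpse can attend to a different part of the sequence, which suits utterances where sentiment and emotion cues are localized.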

Authors (4)
  1. Jean-Benoit Delbrouck (29 papers)
  2. Noé Tits (16 papers)
  3. Mathilde Brousmiche (3 papers)
  4. Stéphane Dupont (21 papers)
Citations (102)
