Transformer-Based Self-Supervised Learning for Emotion Recognition (2204.05103v2)

Published 8 Apr 2022 in q-bio.NC, cs.AI, cs.LG, and eess.SP

Abstract: To exploit representations of time-series signals, such as physiological signals, it is essential that these representations capture relevant information from the whole signal. In this work, we propose to use a Transformer-based model to process electrocardiograms (ECG) for emotion recognition. The Transformer's attention mechanisms can be used to build contextualized representations of a signal, giving more importance to relevant parts. These representations may then be processed with a fully-connected network to predict emotions. To overcome the relatively small size of datasets with emotional labels, we employ self-supervised learning. We gathered several ECG datasets without emotion labels to pre-train our model, which we then fine-tuned for emotion recognition on the AMIGOS dataset. We show that our approach reaches state-of-the-art performance for emotion recognition using ECG signals on AMIGOS. More generally, our experiments show that Transformers and pre-training are promising strategies for emotion recognition with physiological signals.
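To make the two-stage pipeline described in the abstract concrete, here is a minimal PyTorch sketch: a Transformer encoder builds contextualized representations of an ECG signal, is first pre-trained with a self-supervised pretext task on unlabeled data, and is then fine-tuned with a fully-connected head for emotion classification. The patch-based tokenization, the masked-reconstruction pretext task, and all layer sizes are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn


class ECGEmotionTransformer(nn.Module):
    def __init__(self, patch_len=64, max_patches=256, d_model=128,
                 n_heads=4, n_layers=4, n_classes=2):
        super().__init__()
        # Embed fixed-length ECG patches into d_model-dim tokens
        # (hypothetical tokenization; the paper may segment differently).
        self.embed = nn.Linear(patch_len, d_model)
        # Learned positional embeddings so the encoder sees signal order.
        self.pos = nn.Parameter(torch.zeros(1, max_patches, d_model))
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Fully-connected head for emotion prediction (fine-tuning stage).
        self.classifier = nn.Sequential(
            nn.Linear(d_model, d_model), nn.ReLU(),
            nn.Linear(d_model, n_classes))
        # Linear head reconstructing masked patches (pre-training stage).
        self.reconstruct = nn.Linear(d_model, patch_len)

    def forward(self, patches, pretrain=False):
        # patches: (batch, n_patches, patch_len)
        h = self.embed(patches) + self.pos[:, :patches.size(1)]
        h = self.encoder(h)                      # contextualized tokens
        if pretrain:
            return self.reconstruct(h)           # predict masked content
        return self.classifier(h.mean(dim=1))    # pool, then classify


# Stage 1: self-supervised pre-training on unlabeled ECG, sketched here
# as masked-patch reconstruction (an assumed pretext task).
model = ECGEmotionTransformer()
ecg = torch.randn(8, 100, 64)                    # 8 unlabeled recordings
mask = torch.rand(8, 100, 1) < 0.15              # hide ~15% of patches
pred = model(ecg.masked_fill(mask, 0.0), pretrain=True)
m = mask.expand_as(ecg)
pretrain_loss = nn.functional.mse_loss(pred[m], ecg[m])

# Stage 2: fine-tune the same encoder plus the fully-connected head on
# emotion-labeled data (e.g. AMIGOS), reusing the pre-trained weights.
labels = torch.randint(0, 2, (8,))
finetune_loss = nn.functional.cross_entropy(model(ecg), labels)
```

Mean-pooling the contextualized tokens is one simple way to obtain a fixed-size representation for the classifier; a [CLS]-style summary token or attention pooling would fit the abstract's description equally well.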

Authors
  1. Juan Vazquez-Rodriguez
  2. Grégoire Lefebvre
  3. Julien Cumin
  4. James L. Crowley
Citations (22)
