Self context-aware emotion perception on human-robot interaction

Published 18 Jan 2024 in cs.HC and cs.AI | arXiv:2401.10946v1

Abstract: Emotion recognition plays a crucial role in many domains of human-robot interaction. In long-term interactions, robots need to respond to humans continuously and accurately; however, mainstream emotion recognition methods focus mostly on short-term recognition and disregard the context in which emotions are perceived. Humans do take contextual information into account, and different contexts can lead to completely different emotional expressions. In this paper, we introduce the self context-aware model (SCAM), which employs a two-dimensional emotion coordinate system to anchor and re-label distinct emotions, and which incorporates a distinctive information retention structure and a contextual loss. This approach yields significant improvements across the audio, video, and multimodal settings: accuracy rises from 63.10% to 72.46% in the auditory modality, from 77.03% to 80.82% in the visual modality, and from 77.48% to 78.93% in the multimodal setting. In future work, we will validate the reliability and usability of SCAM on robots through psychology experiments.
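
The abstract names two ingredients: anchoring categorical emotions in a two-dimensional (valence-arousal) coordinate system for re-labeling, and a contextual loss. The paper's exact formulation is not given here, so the Python sketch below is illustrative only: the anchor coordinates, the nearest-anchor re-labeling rule, and the smoothness-based contextual loss are all assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

# Hypothetical valence-arousal anchors for four common emotion categories,
# loosely placed on Russell's circumplex model. The paper does not publish
# its coordinates; these values are illustrative only.
EMOTION_ANCHORS = {
    "happy":   ( 0.8,  0.5),
    "angry":   (-0.6,  0.7),
    "sad":     (-0.7, -0.4),
    "neutral": ( 0.0,  0.0),
}

def relabel(pred_va: torch.Tensor) -> list:
    """Re-label predicted (valence, arousal) points by their nearest anchor."""
    names = list(EMOTION_ANCHORS)
    anchors = torch.tensor([EMOTION_ANCHORS[n] for n in names])  # shape (K, 2)
    dists = torch.cdist(pred_va, anchors)                        # shape (T, K)
    return [names[int(i)] for i in dists.argmin(dim=1)]

def contextual_loss(pred_va: torch.Tensor,
                    target_va: torch.Tensor,
                    lam: float = 0.1) -> torch.Tensor:
    """Per-step regression loss plus a penalty on abrupt emotional jumps
    between consecutive steps of an interaction (an assumed formulation)."""
    fit = F.mse_loss(pred_va, target_va)                  # match the anchors
    smooth = (pred_va[1:] - pred_va[:-1]).pow(2).mean()   # keep context coherent
    return fit + lam * smooth
```

Under these assumptions, a sequence of predictions drifting from near (0, 0) toward (0.8, 0.5) would be re-labeled from neutral to happy, while the smoothness term discourages implausible single-step flips between distant anchors.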
