
EmoACT: a Framework to Embed Emotions into Artificial Agents Based on Affect Control Theory (2504.12125v1)

Published 16 Apr 2025 in cs.RO

Abstract: As robots and artificial agents become increasingly integrated into daily life, enhancing their ability to interact with humans is essential. Emotions, which play a crucial role in human interactions, can improve the naturalness and transparency of human-robot interactions (HRI) when embodied in artificial agents. This study aims to employ Affect Control Theory (ACT), a psychological model of emotions deeply rooted in interaction, for the generation of synthetic emotions. A platform-agnostic framework inspired by ACT was developed and implemented in a humanoid robot to assess its impact on human perception. Results show that the frequency of emotional displays impacts how users perceive the robot. Moreover, appropriate emotional expressions seem to enhance the robot's perceived emotional and cognitive agency. The findings suggest that ACT can be successfully employed to embed synthetic emotions into robots, resulting in effective human-robot interactions, where the robot is perceived more as a social agent than merely a machine.

Summary

  • The paper demonstrates that Affect Control Theory can be used to convert identity-impression discrepancies into synthetic emotions for artificial agents.
  • The study's experiments reveal that high-frequency emotional displays notably enhance the perceived emotional agency of robots.
  • The framework integrates impression detection, emotion generation, and emotion expression to improve dynamic human-robot interaction.

EmoACT: Embedding Emotions in Artificial Agents with Affect Control Theory

Introduction

The integration of emotions into robots and artificial agents has garnered interest due to its potential to enhance human-robot interaction (HRI). Emotions facilitate more natural and transparent interactions, akin to human-human communication. The paper "EmoACT: a Framework to Embed Emotions into Artificial Agents Based on Affect Control Theory" presents a framework leveraging Affect Control Theory (ACT) to embed synthetic emotions into humanoid robots. This essay provides a comprehensive overview of the paper’s theoretical background, proposed architecture, and experimental evaluation, focusing on its methodological rigor and empirical findings.

Affect Control Theory in HRI

ACT posits that emotions arise from discrepancies between an individual's fundamental identity sentiment and the transient impression formed of them during an interaction. Both are represented in a three-dimensional affective space whose dimensions are Evaluation, Potency, and Activity (EPA), and the discrepancy (deflection) within this space gives rise to emotion. This theory contrasts with Cognitive Appraisal Theories by explicitly linking emotions to social interactions, aligning well with the goals of HRI. By implementing ACT in robotic systems, the paper explores whether artificial agents can embody emotions in a manner that affects human perception.
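The core idea can be sketched numerically. In the snippet below, the EPA vectors and the emotion function are illustrative assumptions: full ACT derives emotions through empirically estimated coefficient matrices, whereas here the per-dimension discrepancy serves as a simplified proxy, alongside ACT's standard deflection measure (sum of squared differences).

```python
import numpy as np

# EPA vectors: (Evaluation, Potency, Activity), typically bounded around [-4.3, 4.3].
fundamental = np.array([2.5, 1.8, 1.2])  # robot's identity sentiment (hypothetical values)
transient = np.array([1.0, 0.5, 2.0])    # impression formed during interaction (hypothetical)

def deflection(fund, trans):
    """ACT's deflection: sum of squared differences across the EPA dimensions."""
    return float(np.sum((trans - fund) ** 2))

def synthetic_emotion(fund, trans):
    """Simplified emotion vector: the per-dimension identity-impression discrepancy.
    (Full ACT uses coefficient matrices; this is only an illustrative proxy.)"""
    return trans - fund

d = deflection(fundamental, transient)          # 4.58
e = synthetic_emotion(fundamental, transient)   # [-1.5, -1.3, 0.8]
```

A large deflection signals that the interaction is confirming the robot's identity poorly, which under ACT corresponds to a stronger emotional response.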

EmoACT Framework

The EmoACT framework is designed to be platform-independent and comprises several key modules: Impression Detection, Emotion Generation, and Emotion Expression.

  • Impression Detection assesses the user's perception of the robot based on emotional cues, task-related input, and user proximity.
  • Emotion Generation utilizes ACT's equations to convert discrepancies between identity and impression into emotions within the EPA space.
  • Emotion Expression translates emotional states into expressive behaviors using facial and body cues, such as color changes in eye LEDs and animated movements.

    Figure 1: EmoACT Framework.
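To make the module flow concrete, here is a minimal, hypothetical sketch of the three-stage pipeline. The function names, the encoding of cues as EPA vectors, and the LED/movement mapping are all illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

# Hypothetical fundamental EPA sentiment for the robot's identity.
ROBOT_IDENTITY = np.array([2.0, 1.5, 1.0])

def detect_impression(emotional_cue, task_cue, proximity_cue):
    """Impression Detection: fuse cues into a transient EPA impression.
    A simple average stands in for the real detector."""
    return np.stack([emotional_cue, task_cue, proximity_cue]).mean(axis=0)

def generate_emotion(fundamental, transient):
    """Emotion Generation: map the identity-impression discrepancy to an
    emotion vector in EPA space (simplified proxy for ACT's equations)."""
    return transient - fundamental

def express_emotion(emotion):
    """Emotion Expression: map the EPA emotion to expressive behavior.
    Here Evaluation drives LED color and Activity drives movement speed."""
    led_color = "green" if emotion[0] >= 0 else "red"
    movement_speed = 0.5 + 0.1 * emotion[2]
    return {"led_color": led_color, "movement_speed": movement_speed}

impression = detect_impression(
    np.array([1.0, 0.5, 2.0]),   # cues already encoded as EPA vectors (assumed)
    np.array([2.0, 1.0, 0.5]),
    np.array([1.5, 1.5, 1.5]),
)
behavior = express_emotion(generate_emotion(ROBOT_IDENTITY, impression))
```

The key design point the framework emphasizes is that each stage exchanges only EPA vectors, which is what makes the pipeline platform-independent: swapping the humanoid robot for another embodiment only requires replacing the expression mapping.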

Experimental Design

Two experiments were conducted to evaluate the EmoACT framework's effectiveness:

  1. Experiment 1: Compared a non-emotional robot with one that displays emotions at a low frequency.
  2. Experiment 2: Compared a non-emotional robot with one exhibiting high-frequency emotional expressions.

Participants interacted with the robot in collaborative storytelling scenarios, where the robot's emotional expressiveness was tested under varying conditions.

Figure 2: Experiment interaction.

Results

The experiments demonstrate that ACT-based emotional expressions in robots improve users' perceptions of the robot's emotional and cognitive capabilities. The high-frequency emotional display significantly enhanced the perceived emotional agency compared to the control group. However, the improvements in anthropomorphism, animacy, and intelligence were less pronounced, suggesting that frequency plays a crucial role in successful emotion conveyance.

Figure 3: Average score and confidence interval of the Agency Experience questions for the emotional and basic robot in Experiment 2.

The paper's findings suggest that while emotions can enhance the perception of robots, their effective portrayal depends on the frequency and type of emotional cues used. The high-frequency emotional display robot was perceived as more emotionally intelligent, indicating the importance of dynamic expressiveness in developing social robots.

Discussion and Conclusion

The research shows promising results for embedding synthetic emotions using ACT, providing a novel approach to HRI. EmoACT enhances the perceived emotional agency of robots, highlighting the importance of expressiveness frequency in emotion recognition. Future work should explore integrating other psychological theories, including personality traits, to enrich the framework. Despite some limitations, such as the focus on positive identities and instantaneous emotion generation, the framework successfully demonstrates the potential of ACT in improving human-robot interactions.

In conclusion, this paper contributes significantly to the field of social robotics by demonstrating that ACT can be effectively employed to embed synthetic emotions into artificial agents, thus advancing the understanding of emotion-driven human-robot interaction.
