
The Impact of Robots' Facial Emotional Expressions on Light Physical Exercises (2312.02390v1)

Published 4 Dec 2023 in cs.RO

Abstract: To address the global challenge of population aging, our goal is to enhance successful aging through the introduction of robots capable of assisting in daily physical activities and promoting light exercises, which would enhance the cognitive and physical well-being of older adults. Previous studies have shown that facial expressions can increase engagement when interacting with robots. This study aims to investigate how older adults perceive and interact with a robot capable of displaying facial emotions while performing a physical exercise task together. We employed a collaborative robotic arm with a flat panel screen to encourage physical exercise across three different facial emotion conditions. We ran the experiment with older adults aged between 66 and 88. Our findings suggest that individuals perceive robots exhibiting facial expressions as less competent than those without such expressions. Additionally, the presence of facial expressions does not appear to significantly impact participants' levels of engagement, unlike other state-of-the-art studies. This observation is likely linked to our study's emphasis on collaborative physical human-robot interaction (pHRI) applications, as opposed to socially oriented pHRI applications. Additionally, we foresee a requirement for more suitable non-verbal social behavior to effectively enhance participants' engagement levels.

Authors (2)
  1. Nourhan Abdulazeem (1 paper)
  2. Yue Hu (220 papers)

Summary

Introduction

A paper from the University of Waterloo investigates how robots that assist in daily activities and promote light exercise are perceived by older adults, particularly when the robots display facial emotional expressions. This research area is especially important in the context of an aging global population and the growing interest in technology that can support older adults in maintaining their independence and well-being.

Perception of Robots

In their experiment, the researchers used a collaborative robotic arm, Sawyer, programmed to display facial expressions on a flat-panel screen. Participants in the study were assigned to one of three conditions: a robot with no facial expression, a robot with a constant happy expression, or a robot whose expressions changed in response to the user's interaction. Surprisingly, robots with facial expressions were perceived as less competent than their non-expressive counterparts. Moreover, user engagement did not differ measurably whether the robot's expressions were responsive or unresponsive to the users' actions.
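The three-condition design can be illustrated with a minimal sketch. This is hypothetical: the paper's control code is not published in this summary, and the condition names, expression labels, and the `exercise_going_well` signal are all assumptions.

```python
from enum import Enum

class Condition(Enum):
    NO_EXPRESSION = "none"     # blank / neutral screen
    STATIC_HAPPY = "static"    # constant happy face
    RESPONSIVE = "responsive"  # face reacts to the interaction

def select_expression(condition: Condition, exercise_going_well: bool) -> str:
    """Return the facial expression to display for the current time step."""
    if condition is Condition.NO_EXPRESSION:
        return "neutral_screen"
    if condition is Condition.STATIC_HAPPY:
        return "happy"
    # Responsive condition: expression follows the user's performance.
    return "happy" if exercise_going_well else "encouraging"

print(select_expression(Condition.RESPONSIVE, exercise_going_well=False))
# → encouraging
```

The key contrast in the study is between the static and responsive conditions: only the latter couples the displayed emotion to the participant's actions.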

Engagement Levels

The paper also measured engagement levels in both objective and subjective terms. Objective engagement was evaluated based on the participants' performance during the interaction, while subjective engagement was measured through a questionnaire where participants reported their level of engagement. Despite variations in the robot's facial expressions, there appeared to be no significant impact on the level of engagement.

Social-Collaborative Interactions

The authors indicate that the role facial expressions play in human-robot interaction (HRI) may depend heavily on whether the interaction is socially or collaboratively oriented. This distinction suggests that while facial expressions are often thought to enhance the social aspects of humanoid robots, their impact might be less beneficial, or even counterproductive, for robots designed for practical physical tasks.

Future Directions

The authors suggest that further research should explore whether modifications to the robot's facial features, such as the addition of cheeks or eyelids, could improve the perceived intelligence and appeal of such robots. Given the limited sample size, future studies could recruit a broader participant pool spanning different age groups to generalize the findings more effectively. Further exploration of the relationships found in the correlation analysis could also yield deeper insight into the factors that influence human-robot engagement.

In conclusion, while robots with facial emotional expressions were perceived as less competent by older adults, the expressions did not influence engagement levels during light physical exercise interactions. This research contributes to our understanding of how to better design assistive robots for older adults' physical activities, suggesting that perhaps other social behaviors beyond facial expressions should be considered for enhancing user engagement in practical HRI scenarios.