Trust as an Indicator of Robot Functional and Social Acceptance: An Evaluation of Human Conformance to iCub's Decisions
This paper examines the acceptance of a humanoid robot, the iCub, by human users in both functional and social decision-making tasks under uncertainty. The experiment explores the role of trust as an indicator of robot acceptance, with a focus on conformation—users modifying their decisions to match the robot's. The authors employ a comprehensive method, considering user-related, robot-related, and context-related features that might influence trust.
Study Design and Methodology
The experimental design involved fifty-six adult participants interacting with the iCub robot. Each participant completed two types of tasks:
- Functional tasks: Participants judged perceptual properties of physical stimuli, such as relative weight, pitch, and predominant color; these judgments have objectively correct answers.
- Social tasks: Participants gave subjective evaluations, such as judging the appropriateness of items within specific social contexts.
The study employed the Wizard-of-Oz paradigm: participants believed they were interacting with an autonomous robot, while it was in fact controlled remotely. Interactions took place under three scenarios: collaborative, competitive, and neutral.
Prior to the interaction, participants completed two questionnaires assessing their desire for control and their attitudes towards the social influence of robots. This preparatory phase was designed to capture individual dispositions that might shape conformation to the robot's decisions.
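To make the conformation measure concrete, the sketch below shows one plausible way to compute a per-task-type conformation rate from trial records. It is an illustration, not the authors' actual analysis pipeline; the record structure and field names (`task_type`, `initial_answer`, `robot_answer`, `final_answer`) are assumptions for this example.

```python
from collections import defaultdict

# Illustrative only: the record structure and field names are assumed,
# not taken from the paper. A trial counts toward conformation when the
# participant's initial answer disagreed with the robot's and the
# participant's final answer then matched the robot's.

def conformation_rates(trials):
    """Return the conformation rate per task type (e.g. 'functional'/'social')."""
    switched = defaultdict(int)   # disagreements resolved in the robot's favor
    eligible = defaultdict(int)   # trials where participant and robot disagreed
    for t in trials:
        if t["initial_answer"] != t["robot_answer"]:
            eligible[t["task_type"]] += 1
            if t["final_answer"] == t["robot_answer"]:
                switched[t["task_type"]] += 1
    return {k: switched[k] / eligible[k] for k in eligible if eligible[k]}

# Hypothetical usage with two made-up trials:
trials = [
    {"task_type": "functional", "initial_answer": "A",
     "robot_answer": "B", "final_answer": "B"},   # conformed
    {"task_type": "social", "initial_answer": "A",
     "robot_answer": "B", "final_answer": "A"},   # did not conform
]
print(conformation_rates(trials))  # {'functional': 1.0, 'social': 0.0}
```

Restricting the denominator to disagreement trials keeps the metric focused on genuine opinion change rather than cases where participant and robot happened to agree from the start.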
Findings
Results indicated that participants conformed more often in functional tasks than in social tasks; only a small subset trusted the robot's social savvy. This suggests that while functional acceptance, grounded in the robot's perceived technical precision, was readily granted, social acceptance was far less evident, perhaps reflecting a belief that robots lack the subjective experience that social competence requires.
Interestingly, participants who conformed socially did not necessarily conform functionally, challenging the assumption that social trust builds on functional trust. This dissociation implies that participants assess the iCub along two separate dimensions: as a tool with technical expertise, and as a social entity capable of taking part in social tasks.
Implications
These observations underscore the complexity of human-robot trust dynamics, particularly the need to distinguish functional savvy from social savvy. The results point to a nuanced picture of robot acceptance across contexts of uncertainty: despite the push towards robots that are both functionally and socially capable, the paper reveals a potential dichotomy in user perception and expectations.
Future Directions
Future investigations should examine whether the observed behavior stems from robots being perceived as "socially ignorant" or whether the intrinsic nature of the tasks themselves drives this trust divide. Additional research could compare interactions with humanoid robots against interactions with other agents, including humans, to illuminate how trust mechanisms differ across agent types.
Bridging the acceptance gap between functional and social savvy through robot design requires understanding how robots are perceived in terms of capability and reliability. Researchers could further explore the intersection of task type, robot appearance, and cultural influences. Additionally, broadening participant demographics and examining the longitudinal effects of repeated robot interactions could yield deeper insight into how trust evolves.
In summary, this paper contributes significantly to the understanding of human-robot interaction dynamics and raises compelling questions for future work on establishing holistic trust in robots as partners in everyday tasks.