
Trust as indicator of robot functional and social acceptance. An experimental study on user conformation to the iCub's answers (1510.03678v1)

Published 13 Oct 2015 in cs.RO, cs.CY, and cs.HC

Abstract: To investigate the functional and social acceptance of a humanoid robot, we carried out an experimental study with 56 adult participants and the iCub robot. Trust in the robot has been considered as a main indicator of acceptance in decision-making tasks characterized by perceptual uncertainty (e.g., evaluating the weight of two objects) and socio-cognitive uncertainty (e.g., evaluating which is the most suitable item in a specific context), and measured by the participants' conformation to the iCub's answers to specific questions. In particular, we were interested in understanding whether specific (i) user-related features (i.e. desire for control), (ii) robot-related features (i.e., attitude towards social influence of robots), and (iii) context-related features (i.e., collaborative vs. competitive scenario), may influence their trust towards the iCub robot. We found that participants conformed more to the iCub's answers when their decisions were about functional issues than when they were about social issues. Moreover, the few participants conforming to the iCub's answers for social issues also conformed less for functional issues. Trust in the robot's functional savvy does not thus seem to be a pre-requisite for trust in its social savvy. Finally, desire for control, attitude towards social influence of robots and type of interaction scenario did not influence the trust in iCub. Results are discussed with relation to methodology of HRI research.

Authors (5)
  1. Ilaria Gaudiello (1 paper)
  2. Elisabetta Zibetti (2 papers)
  3. Sebastien Lefort (2 papers)
  4. Mohamed Chetouani (36 papers)
  5. Serena Ivaldi (18 papers)
Citations (176)

Summary

Trust as an Indicator of Robot Functional and Social Acceptance: An Evaluation of Human Conformation to the iCub's Answers

This paper examines the acceptance of a humanoid robot, the iCub, by human users in both functional and social decision-making tasks under uncertainty. The experiment explores the role of trust as an indicator of robot acceptance, focusing on conformation, that is, users modifying their decisions to match the robot's. The authors employ a comprehensive method, considering user-related, robot-related, and context-related features that might influence trust.

Study Design and Methodology

The experimental design involved fifty-six adult participants interacting with the iCub robot. Participants completed two types of tasks:

  1. Functional tasks: Participants assessed perceptual characteristics of stimuli, such as comparing weights, pitches, and predominant colors, judgments with objectively measurable answers.
  2. Social tasks: Participants made subjective evaluations of the appropriateness of items within specific social contexts.

The paper utilized the Wizard-of-Oz paradigm: participants believed they were interacting with an autonomous robot, though it was controlled remotely. Participants were assigned to one of three interaction scenarios: collaborative, competitive, or neutral.

Prior to robot interaction, participants completed two questionnaires aimed at assessing their desire for control and attitudes towards social influence of robots. This preparatory phase was designed to probe individual propensities influencing their conformation to the robot's decisions.
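The conformation measure at the heart of this methodology can be sketched as a simple rate over trials in which the robot's answer disagreed with the participant's initial one. The following Python sketch is illustrative only; the data structure and field names are assumptions, not the authors' code or data.

```python
# Illustrative sketch (not the authors' implementation): a participant's
# conformation rate is the fraction of disagreement trials in which they
# switched their final answer to match the robot's.

def conformation_rate(trials):
    """Each trial is a dict with the participant's initial answer,
    the robot's answer, and the participant's final answer."""
    # Only trials where the robot disagreed can reveal conformation.
    disagreements = [t for t in trials if t["initial"] != t["robot"]]
    if not disagreements:
        return 0.0
    conformed = sum(1 for t in disagreements if t["final"] == t["robot"])
    return conformed / len(disagreements)

# Hypothetical trials for one participant, split by task type.
functional = [
    {"initial": "A", "robot": "B", "final": "B"},  # conformed
    {"initial": "A", "robot": "B", "final": "A"},  # kept own answer
]
social = [
    {"initial": "X", "robot": "Y", "final": "X"},  # kept own answer
    {"initial": "X", "robot": "Y", "final": "X"},  # kept own answer
]

print(conformation_rate(functional))  # 0.5
print(conformation_rate(social))      # 0.0
```

Comparing such rates across functional and social tasks, and across the collaborative, competitive, and neutral scenarios, is what allows trust to be operationalized as a behavioral measure rather than a self-report.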

Findings

Results indicated that participants were more likely to conform in functional tasks than in social tasks; only a small subset trusted the robot's social savvy. This suggests that while functional acceptance, grounded in the robot's precise technical skills, was readily granted, social acceptance was less evident, perhaps due to a belief that robots lack the subjective experience necessary for social savvy.

Interestingly, the few participants who conformed for social issues conformed less for functional ones, challenging the assumption that social trust builds on functional trust. This dichotomy implies that participants viewed the iCub either as a tool with technical expertise or as a social entity suited to social tasks, but rarely as both.

Implications

These observations underscore the complexities of human-robot trust dynamics, particularly in distinguishing functional savvy from social savvy. The results reflect a nuanced understanding of robot acceptance within varying contexts of uncertainty. Despite the societal push towards robots being dual-functional and socially capable, the paper reveals a potential dichotomy in user perception and expectations.

Future Directions

Future investigations should examine whether the observed behavior depends on robots being perceived as "socially ignorant" or whether the intrinsic nature of the tasks contributes to this trust divide. Additional research comparing human-human and human-robot interactions could illuminate distinctions in trust mechanisms across different agents.

Improving robot design to bridge the acceptance gap between functional and social savvy requires understanding how robots are perceived in terms of capability and reliability. Researchers could benefit from further exploring the interplay of task nature, robot appearance, and cultural influences. Additionally, expanding participant demographics and examining the longitudinal impact of repeated robot interactions could provide deeper insight into evolving trust dynamics.

In summary, this paper contributes significantly to understanding human-robot interaction dynamics and raises compelling inquiries for further exploration into establishing holistic trust in robots as integral partners in everyday tasks.