Title
Building Appropriate Trust In Human-Robot Teams
Abstract
Future robotic systems are expected to transition from tools to teammates, characterized by increasingly autonomous, intelligent robots interacting with humans in a more naturalistic manner, approaching a relationship more akin to human-human teamwork. Given the impact of trust observed in other systems, trust in the robot team member will likely be critical to effective and safe performance. Our thesis for this paper is that trust in a robot team member must be appropriately calibrated rather than simply maximized. Drawing on mental model theory, we describe how the human team member's understanding of the system contributes to trust in human-robot teaming. We discuss how mental models relate to the physical and behavioral characteristics of the robot, on the one hand, and to affective and behavioral outcomes, such as trust and system use/disuse/misuse, on the other. We expand upon our discussion by providing recommendations for best practices in research and design of human-robot teams and other systems using artificial intelligence. © 2013, Association for the Advancement of Artificial Intelligence.
Publication Date
9-9-2013
Publication Title
AAAI Spring Symposium - Technical Report
Volume
SS-13-07
Number of Pages
60-65
Document Type
Article; Proceedings Paper
Personal Identifier
scopus
Copyright Status
Unknown
Scopus ID
84883411398 (Scopus)
Source API URL
https://api.elsevier.com/content/abstract/scopus_id/84883411398
STARS Citation
Ososky, Scott; Schuster, David; Phillips, Elizabeth; and Jentsch, Florian, "Building Appropriate Trust In Human-Robot Teams" (2013). Scopus Export 2010-2014. 6227.
https://stars.library.ucf.edu/scopus2010/6227