Varying Social Cue Constellations Results In Different Attributed Social Signals In A Simulated Surveillance Task
Abstract
A better understanding of human mental states in social contexts could pave the way for robotic systems capable of more natural and intuitive interaction. Working toward that goal, this paper reports a study examining human perception of social signals based on manipulated sets of social cues in a simulated socio-cultural environment. Participants viewed video vignettes of a simulated marketplace from the perspective of an observing robot and were asked to attribute mental states to a human avatar based on the avatar's expression of a range of social cues. Results indicated that subtly varying combinations of social cues led participants to perceive different social signals. The mental state attributions were also significantly associated with what participants considered an appropriate behavioral response for the robot to exhibit toward the avatar. We discuss these results in the context of developing computational perceptual systems for socially intelligent robots.
Publication Date
1-1-2015
Publication Title
Proceedings of the 28th International Florida Artificial Intelligence Research Society Conference, FLAIRS 2015
Number of Pages
61-66
Document Type
Article; Proceedings Paper
Personal Identifier
scopus
Copyright Status
Unknown
Scopus ID
84958160721 (Scopus)
Source API URL
https://api.elsevier.com/content/abstract/scopus_id/84958160721
STARS Citation
Lobato, Emilio J.C.; Warta, Samantha F.; Wiltshire, Travis J.; and Fiore, Stephen M., "Varying Social Cue Constellations Results In Different Attributed Social Signals In A Simulated Surveillance Task" (2015). Scopus Export 2015-2019. 2023.
https://stars.library.ucf.edu/scopus2015/2023