Assessing Multimodal Interactions With Mixed-Initiative Teams
Keywords
Human-robot interaction; Multimodal interfaces; Simulation; Tactile displays
Abstract
The state of the art in robotics is advancing to support the warfighter’s ability to project force and extend reach across a variety of future missions. Seamless integration of robots with the warfighter will require advancing interfaces from teleoperation to collaboration. The current approach to meeting this requirement is to equip tomorrow’s robots with human-to-human communication capabilities through multimodal communication. Though advanced, today’s robots do not yet come close to supporting teaming in dismounted military operations; simulation is therefore required for developers to assess multimodal interfaces in complex multi-tasking scenarios. This paper describes existing and future simulations to support assessment of multimodal human-robot interaction in dismounted soldier-robot teams.
Publication Date
1-1-2018
Publication Title
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume
10904 LNCS
Number of Pages
175-184
Document Type
Article; Proceedings Paper
Personal Identifier
scopus
DOI Link
https://doi.org/10.1007/978-3-319-92043-6_15
Copyright Status
Unknown
Scopus ID
85050384405 (Scopus)
Source API URL
https://api.elsevier.com/content/abstract/scopus_id/85050384405
STARS Citation
Barber, Daniel, "Assessing Multimodal Interactions With Mixed-Initiative Teams" (2018). Scopus Export 2015-2019. 9465.
https://stars.library.ucf.edu/scopus2015/9465