Title
Exploring Head Tracked Head Mounted Displays For First Person Robot Teleoperation
Keywords
3D Interaction; Robots; User Studies
Abstract
We explore the capabilities of head tracking combined with head mounted displays (HMDs) as an input modality for robot navigation. Using a Parrot AR Drone, we test five techniques, including metaphors for plane-like banking control, car-like turning control, and virtual reality-inspired translation and rotation schemes, which we compare with a more traditional game controller interface. We conducted a user study to evaluate the effectiveness of each interface in navigating through a series of archways in an indoor course. We examine a number of qualitative and quantitative metrics to determine performance and preference for each metaphor. Our results show an appreciation for head rotation based controls over other head gesture techniques, with the classic controller being preferred overall. We discuss possible shortcomings of head tracked HMDs as a primary input method and propose improved metaphors that alleviate some of these drawbacks. © 2014 ACM.
Publication Date
3-14-2014
Publication Title
International Conference on Intelligent User Interfaces, Proceedings IUI
Number of Pages
323-328
Document Type
Article; Proceedings Paper
Personal Identifier
scopus
DOI Link
https://doi.org/10.1145/2557500.2557527
Copyright Status
Unknown
Scopus ID
84897818610 (Scopus)
Source API URL
https://api.elsevier.com/content/abstract/scopus_id/84897818610
STARS Citation
Pittman, Corey and LaViola, Joseph J., "Exploring Head Tracked Head Mounted Displays For First Person Robot Teleoperation" (2014). Scopus Export 2010-2014. 8846.
https://stars.library.ucf.edu/scopus2010/8846