AgileSLAM: A Localization Approach for Agile Head Movements in Augmented Reality

Keywords

Artificial Intelligence; Computer Vision; Computing Methodologies; Human Computer Interaction; Human-centered computing; Interaction Paradigms; Mixed / Augmented Reality; Tracking

Abstract

Realistic augmented reality systems require both accurate localization of the user and a map of the environment. In markerless environments this is often done with SLAM algorithms, which localize by extracting features from the environment and comparing how they have moved between a keyframe and the current frame. However, human head agility, such as that seen in video gaming tasks or training exercises, poses a problem: a fast rotation can move every previously tracked feature out of the field of view, leaving the system unable to localize accurately. In this paper we present an approach that tracks agile human head movements by using an array of RGB-D sensors and reconstructing the sensor data into a 360-degree set of features that is fed into our SLAM algorithm. We run experiments with pre-recorded agile-movement scenarios that demonstrate the accuracy of our system. We also compare our approach against single-sensor algorithms and show a significant improvement in localization accuracy (up to 15 to 20 times better). Together, the sensor array and SLAM algorithm form a novel approach for accurately localizing extremely agile human head movements.
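The abstract describes, but does not detail, the 360-degree reconstruction step. As a minimal sketch of the general idea only, assuming each sensor's pose on the head rig is known from an offline calibration, the following Python fragment transforms feature points from several RGB-D sensors into a shared rig frame and stacks them into one fused cloud. All names here (fuse_feature_clouds, the example extrinsics) are hypothetical illustrations, not the authors' code.

# Sketch: fuse 3-D feature points from multiple RGB-D sensors into one
# 360-degree cloud in a common head-rig frame. The rig-from-sensor
# extrinsics are assumed known from calibration (an assumption, not
# something specified by the paper's abstract).
import numpy as np

def to_homogeneous(points):
    # (N, 3) points -> (N, 4) homogeneous coordinates.
    return np.hstack([points, np.ones((points.shape[0], 1))])

def fuse_feature_clouds(clouds, extrinsics):
    # clouds     : list of (N_i, 3) arrays, points in each sensor's frame
    # extrinsics : list of 4x4 rig-from-sensor transforms
    fused = []
    for pts, T in zip(clouds, extrinsics):
        # Map each sensor's points into the shared rig frame.
        fused.append((T @ to_homogeneous(pts).T).T[:, :3])
    # One combined cloud covering the union of all sensor fields of view.
    return np.vstack(fused)

# Example with two hypothetical sensors, one facing forward, one backward.
T_front = np.eye(4)
T_back = np.eye(4)
T_back[:3, :3] = np.diag([-1.0, 1.0, -1.0])  # 180-degree yaw about y
front_pts = np.random.rand(100, 3)
back_pts = np.random.rand(100, 3)
cloud_360 = fuse_feature_clouds([front_pts, back_pts], [T_front, T_back])
print(cloud_360.shape)  # (200, 3)

The point of such a fused representation is that a fast head rotation only changes which sensor observes a given feature; the feature itself stays inside the combined 360-degree field of view, so frame-to-keyframe matching can continue.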

Publication Date

7-2-2018

Publication Title

Adjunct Proceedings - 2018 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2018

Number of Pages

25-30

Document Type

Article; Proceedings Paper

Personal Identifier

scopus

DOI Link

https://doi.org/10.1109/ISMAR-Adjunct.2018.00025

Scopus ID

85065535049 (Scopus)

Source API URL

https://api.elsevier.com/content/abstract/scopus_id/85065535049

