A Novel Approach for Cooperative Motion Capture (COMOCAP)
Keywords
Computing methodologies → Motion capture; Graphics input devices; Human-centered computing → Mixed / augmented reality; Virtual reality
Abstract
Conventional motion capture (MOCAP) systems, e.g., optical systems, typically perform well for one person but less so for multiple people in close proximity. Measurement quality can decline with distance and even drop out as source/sensor components are occluded by nearby people. Furthermore, conventional optical MOCAP systems estimate body posture using a global estimation approach employing cameras fixed in the environment, typically at a distance such that one person or object can easily occlude another, and the relative error between tracked objects in the scene can increase as they move farther from the cameras and/or closer to each other. Body-relative tracking approaches instead use body-worn sensors and/or sources to track limbs with respect to the head or torso, taking advantage of the proximity of the limbs to the body. We present a novel approach to MOCAP that combines and extends conventional global and body-relative approaches by distributing both sensing and active signaling over each person's body to facilitate body-relative (intra-user) MOCAP for one person and body-body (inter-user) MOCAP for multiple people, in an approach we call cooperative motion capture (COMOCAP). We support the validity of the approach with simulation results from a system composed of acoustic transceivers (receiver-transmitter units) that provide inter-transceiver range measurements; optical, magnetic, and other types of transceivers could also be used. Our simulations demonstrate that this approach effectively improves accuracy and robustness to occlusions when multiple people are in close proximity.
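The abstract describes localizing body-worn transceivers from inter-transceiver range measurements. As a minimal illustration of the kind of computation involved (not the paper's method; the function name, anchor layout, and Gauss-Newton formulation are our own assumptions), the sketch below estimates the 3D position of one transceiver from its measured ranges to several transceivers at known body-relative positions:

```python
import numpy as np

def estimate_position(anchors, ranges, x0, iters=20):
    """Estimate a point's 3D position from range measurements to known
    anchor positions via Gauss-Newton nonlinear least squares.

    anchors : (n, 3) known transceiver positions
    ranges  : (n,)   measured distances to the unknown point
    x0      : (3,)   initial guess for the unknown position
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        diffs = x - anchors                    # (n, 3) vectors to anchors
        dists = np.linalg.norm(diffs, axis=1)  # predicted ranges
        residuals = dists - ranges             # range prediction errors
        J = diffs / dists[:, None]             # Jacobian of range w.r.t. x
        step, *_ = np.linalg.lstsq(J, residuals, rcond=None)
        x -= step                              # Gauss-Newton update
    return x

# Hypothetical example: four torso-mounted anchors locating a wrist unit.
anchors = np.array([[0.0, 0.0, 0.0],
                    [0.4, 0.0, 0.0],
                    [0.0, 0.4, 0.0],
                    [0.2, 0.2, 0.5]])
true_pos = np.array([0.3, 0.1, 0.4])
ranges = np.linalg.norm(anchors - true_pos, axis=1)  # noiseless ranges
est = estimate_position(anchors, ranges, x0=[0.2, 0.2, 0.2])
```

With noiseless ranges and four non-coplanar anchors the solve is well conditioned; a full COMOCAP-style system would additionally fuse many such measurements (intra- and inter-user) over time, e.g., in a filtering framework.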
Publication Date
1-1-2018
Publication Title
ICAT-EGVE 2018 - 28th International Conference on Artificial Reality and Telexistence and 23rd Eurographics Symposium on Virtual Environments
Number of Pages
73-80
Document Type
Article; Proceedings Paper
Personal Identifier
scopus
DOI Link
https://doi.org/10.2312/egve.20181317
Copyright Status
Unknown
Scopus ID
85121632245 (Scopus)
Source API URL
https://api.elsevier.com/content/abstract/scopus_id/85121632245
STARS Citation
Welch, Gregory; Wang, Tianren; Bishop, Gary; and Bruder, Gerd, "A Novel Approach for Cooperative Motion Capture (COMOCAP)" (2018). Scopus Export 2015-2019. 10544.
https://stars.library.ucf.edu/scopus2015/10544