Touch Sensing On Non-Parametric Rear-Projection Surfaces: A Physical-Virtual Head For Hands-On Healthcare Training
Keywords
H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems - Animations, Artificial, Augmented, and Virtual Realities; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism - Virtual Reality; I.3.8 [Computer Graphics]: Applications
Abstract
We demonstrate a generalizable method for unified multitouch detection and response on a human head-shaped surface with a rear-projection animated 3D face. The method supports hands-on, touch-sensitive training with dynamic physical-virtual patient behavior. The method, which generalizes to other non-parametric rear-projection surfaces, requires one or more infrared (IR) cameras, one or more projectors, IR light sources, and a rear-projection surface. IR light reflected off human fingers is captured by cameras with matched IR pass filters, allowing localization of multiple finger touch events. These events are tightly coupled with the rendering system to produce auditory and visual responses on the animated face displayed by the projector(s), resulting in a responsive, interactive experience. We illustrate the applicability of our physical prototype in a medical training scenario.
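As a rough illustration of the kind of pipeline the abstract describes, the sketch below detects bright fingertip reflections in an IR camera image by thresholding and blob extraction (OpenCV 4.x in Python assumed). The camera index, brightness threshold, and blob-area bounds are illustrative assumptions, not values from the paper, and the mapping of detected points onto the head-shaped surface and into the rendering system is only indicated in comments.

# Minimal sketch of IR-based multitouch blob detection, assuming a single IR
# camera (with an IR pass filter) exposed as an OpenCV capture device.
# All numeric constants are placeholders for illustration.
import cv2

CAMERA_INDEX = 0            # assumed device index of the IR camera
BRIGHTNESS_THRESHOLD = 200  # fingertip reflections appear as bright blobs
MIN_BLOB_AREA = 30          # reject small sensor noise
MAX_BLOB_AREA = 2000        # reject large reflections (e.g., a whole palm)

def detect_touches(frame):
    """Return (x, y) pixel centroids of candidate fingertip touch blobs."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    _, mask = cv2.threshold(blurred, BRIGHTNESS_THRESHOLD, 255, cv2.THRESH_BINARY)
    # OpenCV 4.x returns (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    touches = []
    for contour in contours:
        area = cv2.contourArea(contour)
        if MIN_BLOB_AREA <= area <= MAX_BLOB_AREA:
            m = cv2.moments(contour)
            if m["m00"] > 0:
                touches.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return touches

if __name__ == "__main__":
    cap = cv2.VideoCapture(CAMERA_INDEX)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        for x, y in detect_touches(frame):
            # A full system would map these camera-space points onto the
            # non-parametric head surface (e.g., via a calibrated lookup)
            # and notify the rendering system; here we simply print them.
            print(f"touch at ({x:.1f}, {y:.1f})")
    cap.release()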
Publication Date
8-25-2015
Publication Title
2015 IEEE Virtual Reality Conference, VR 2015 - Proceedings
Pages
69-74
Document Type
Article; Proceedings Paper
Personal Identifier
scopus
DOI Link
https://doi.org/10.1109/VR.2015.7223326
Copyright Status
Unknown
Scopus ID
84954496838 (Scopus)
Source API URL
https://api.elsevier.com/content/abstract/scopus_id/84954496838
STARS Citation
Hochreiter, Jason; Daher, Salam; Nagendran, Arjun; Gonzalez, Laura; and Welch, Greg, "Touch Sensing On Non-Parametric Rear-Projection Surfaces: A Physical-Virtual Head For Hands-On Healthcare Training" (2015). Scopus Export 2015-2019. 2042.
https://stars.library.ucf.edu/scopus2015/2042