Adaptive Filtering Of Physical-Virtual Artifacts For Synthetic Animatronics

Keywords

Computing methodologies → Rendering; Mixed / augmented reality; Perception; Simulation support systems

Abstract

Spatial Augmented Reality (SAR), e.g., based on monoscopic imagery projected onto physical three-dimensional (3D) surfaces, can be particularly well suited for ad hoc group or multi-user augmented reality experiences, since it does not encumber users with head-worn or carried devices. However, conveying a sense of realistic 3D shape and movement on SAR surfaces using monoscopic imagery is a difficult challenge. While previous work focused on physically actuating such surfaces to achieve geometrically dynamic content, we introduce a different concept, which we call "Synthetic Animatronics," i.e., conveying geometric movement or deformation purely by manipulating the imagery shown on a static display surface. We present a model for the distribution of the viewpoint-dependent distortion that occurs when there are discrepancies between the physical display surface and the virtual object being represented, and describe a real-time implementation of a method that adaptively filters the imagery based on an approximation of the expected potential error. Finally, we describe an existing physical SAR setup well suited for synthetic animatronics and a corresponding Unity-based SAR simulator allowing for flexible exploration and validation of the technique and various parameters.
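The adaptive-filtering idea in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, the discrete viewpoint set, and the linear error-to-blur mapping below are assumptions chosen for illustration. The idea: for each point on the physical display surface, estimate the expected angular disparity, over a distribution of candidate viewpoints, between that surface point and the virtual point it is meant to depict, then filter the imagery more heavily where the expected error is larger.

```python
import numpy as np

def expected_angular_error(surface_pt, virtual_pt, viewpoints):
    """Approximate viewpoint-dependent distortion for one display point:
    the angle, as seen from each candidate viewpoint, between the physical
    surface point (where the pixel actually lies) and the virtual point it
    is intended to represent. Returns the mean over the viewpoint set as a
    stand-in for the expected error under the viewpoint distribution."""
    errs = []
    for v in viewpoints:
        a = surface_pt - v
        b = virtual_pt - v
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        errs.append(np.arccos(np.clip(cos, -1.0, 1.0)))
    return float(np.mean(errs))

def blur_radius(err, gain=200.0, max_px=15.0):
    """Map expected angular error (radians) to a blur-kernel radius in
    pixels (hypothetical linear mapping with a cap): points with a large
    physical/virtual discrepancy get filtered more aggressively, softening
    the artifacts that would otherwise be visible from off-axis viewpoints."""
    return min(gain * err, max_px)

if __name__ == "__main__":
    # Hypothetical example: a surface point vs. the displaced virtual point
    # it should depict (5 cm discrepancy), seen from three viewpoints.
    surface_pt = np.array([0.0, 0.0, 0.0])
    virtual_pt = np.array([0.0, 0.0, 0.05])
    viewpoints = [np.array([1.0, 0.2 * i, 1.0]) for i in range(3)]
    err = expected_angular_error(surface_pt, virtual_pt, viewpoints)
    print(f"expected error: {err:.4f} rad -> blur radius: {blur_radius(err):.1f} px")
```

In a real-time rendering context, per-texel radii of this kind would typically be precomputed into a texture and used to drive a variable-width blur in a fragment shader; the offline sketch above only shows the error estimate and its mapping to filter strength.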

Publication Date

1-1-2018

Publication Title

ICAT-EGVE 2018 - 28th International Conference on Artificial Reality and Telexistence and 23rd Eurographics Symposium on Virtual Environments

Number of Pages

65-72

Document Type

Article; Proceedings Paper

Personal Identifier

scopus

DOI Link

https://doi.org/10.2312/egve.20181316

Scopus ID

85100366224 (Scopus)

Source API URL

https://api.elsevier.com/content/abstract/scopus_id/85100366224
