Analysis of Proximity-Based Multimodal Feedback for 3D Selection in Immersive Virtual Environments

Keywords

Evaluation; H.5.2 [Information Interfaces and Presentation]: User Interfaces - Input Devices and Strategies

Abstract

Interaction tasks in virtual reality (VR), such as three-dimensional (3D) selection or manipulation of objects, often suffer from reduced performance because the feedback provided by VR systems is missing or differs from that of corresponding real-world interactions. Vibrotactile and auditory feedback have been suggested as additional perceptual cues that complement the visual channel to improve interaction in VR. However, it has rarely been shown that multimodal feedback improves performance or reduces errors during 3D object selection, and little research has been conducted on proximity-based multimodal feedback, in which stimulus intensities depend on the spatiotemporal relation between the input device and the virtual target object. In this paper, we analyzed the effects of unimodal and bimodal feedback provided through the visual, auditory, and tactile modalities while users perform 3D object selections in virtual environments (VEs), comparing both binary and continuous proximity-based feedback. We conducted a Fitts' Law experiment and evaluated the different feedback approaches. The results show that the feedback types affect the ballistic and correction phases of the selection movement and significantly influence user performance.
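
To illustrate the distinction the abstract draws between binary and continuous proximity-based feedback, the following Python sketch maps the distance between the input device and the target to a normalized stimulus intensity in both styles, and computes the Fitts' Law index of difficulty commonly used in such selection experiments. The activation radius, the linear intensity ramp, and all numeric values are illustrative assumptions for this sketch and are not the mappings or parameters used by the authors.

```python
import math

# Hypothetical activation radius around the target (metres); an assumption
# for this sketch, not a value reported in the paper.
ACTIVATION_RADIUS = 0.15


def binary_feedback(distance: float) -> float:
    """Binary proximity feedback: full intensity once the input device
    is within the activation radius of the target, otherwise no stimulus."""
    return 1.0 if distance <= ACTIVATION_RADIUS else 0.0


def continuous_feedback(distance: float) -> float:
    """Continuous proximity feedback: intensity ramps linearly from 0 at the
    activation radius to 1 at the target centre (one possible mapping)."""
    if distance >= ACTIVATION_RADIUS:
        return 0.0
    return 1.0 - distance / ACTIVATION_RADIUS


def fitts_index_of_difficulty(amplitude: float, width: float) -> float:
    """Shannon formulation of the Fitts' Law index of difficulty in bits:
    ID = log2(A / W + 1), with movement amplitude A and target width W."""
    return math.log2(amplitude / width + 1.0)


if __name__ == "__main__":
    for d in (0.30, 0.15, 0.10, 0.05, 0.0):
        print(f"d = {d:.2f} m  binary = {binary_feedback(d):.2f}  "
              f"continuous = {continuous_feedback(d):.2f}")
    print(f"ID for A = 0.40 m, W = 0.05 m: "
          f"{fitts_index_of_difficulty(0.40, 0.05):.2f} bits")
```

In a running VR application, such a function would typically be evaluated every frame and its output used to scale the vibrotactile amplitude or auditory volume; the linear ramp shown here is only one of several plausible distance-to-intensity mappings.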

Publication Date

8-24-2018

Publication Title

25th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2018 - Proceedings

Number of Pages

327-334

Document Type

Article; Proceedings Paper

Personal Identifier

scopus

DOI Link

https://doi.org/10.1109/VR.2018.8446317

Scopus ID

85053818215 (Scopus)

Source API URL

https://api.elsevier.com/content/abstract/scopus_id/85053818215
