Title
Maui: A Multimodal Affective User Interface
Keywords
Affect recognition; Emotions; Intelligent interfaces; Interface agent
Abstract
Human intelligence is being increasingly redefined to include the all-encompassing effect of emotions upon what used to be considered 'pure reason'. With recent progress in research on computer vision, speech/prosody recognition, and bio-feedback, real-time recognition of affect will enhance human-computer interaction considerably, as well as support the development of new emotion theories. In this article, we describe how affect, moods, and emotions closely interact with cognition, and how affect and emotion are the quintessential multimodal processes in humans. We then propose an adaptive system architecture designed to sense the user's emotional and affective states via three multimodal subsystems (V, K, A): namely (1) the Visual (from facial images and videos), (2) the Kinesthetic (from autonomic nervous system (ANS) signals), and (3) the Auditory (from speech). The results of this sensing are then integrated into the multimodal perceived state of the user. A multimodal anthropomorphic interface agent then adapts its interface by responding most appropriately to the current emotional state of its user, and provides intelligent multimodal feedback to the user.
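To make the V/K/A integration step concrete, the following is a minimal sketch of how per-modality estimates might be fused into a single perceived user state. All names (ModalityEstimate, fuse, the emotion label set) are hypothetical, and the reliability-weighted combination rule is an assumption for illustration; the abstract does not specify the fusion method.

from dataclasses import dataclass
from typing import Dict, List

# Hypothetical emotion label set; the paper's actual categories may differ.
EMOTIONS = ("neutral", "happy", "angry", "sad", "afraid")

@dataclass
class ModalityEstimate:
    """Output of one sensing subsystem (Visual, Kinesthetic, or Auditory)."""
    scores: Dict[str, float]   # emotion label -> confidence in [0, 1]
    reliability: float         # how much to trust this channel right now

def fuse(estimates: List[ModalityEstimate]) -> Dict[str, float]:
    """Reliability-weighted fusion of the V, K, A channel outputs into
    one perceived user state (illustrative rule, not the paper's)."""
    fused = {e: 0.0 for e in EMOTIONS}
    total = sum(m.reliability for m in estimates) or 1.0
    for m in estimates:
        for e in EMOTIONS:
            fused[e] += m.reliability * m.scores.get(e, 0.0) / total
    return fused

if __name__ == "__main__":
    visual = ModalityEstimate({"happy": 0.7, "neutral": 0.3}, reliability=0.9)
    kinesthetic = ModalityEstimate({"happy": 0.4, "angry": 0.2}, reliability=0.5)
    auditory = ModalityEstimate({"happy": 0.6, "sad": 0.1}, reliability=0.7)
    state = fuse([visual, kinesthetic, auditory])
    print(max(state, key=state.get))  # -> "happy"

An interface agent could then condition its multimodal feedback on the highest-scoring label in the fused state, as the final print statement suggests.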
Publication Date
12-1-2002
Publication Title
Proceedings of the ACM International Multimedia Conference and Exhibition
Pages
161-170
Document Type
Article; Proceedings Paper
Personal Identifier
scopus
Copyright Status
Unknown
Scopus ID
0038376887 (Scopus)
Source API URL
https://api.elsevier.com/content/abstract/scopus_id/0038376887
STARS Citation
Lisetti, Christine L. and Nasoz, Fatma, "Maui: A Multimodal Affective User Interface" (2002). Scopus Export 2000s. 2325.
https://stars.library.ucf.edu/scopus2000/2325