Title

Modeling Multimodal Expression of User's Affective Subjective Experience

Keywords

Affect; Embodiment; Emotion; Interaction; Perception; Subjective experience

Abstract

With the growing importance of information technology in our everyday life, new types of applications are appearing that require the understanding of information in a broad sense. Information that includes affective and subjective content plays a major role not only in an individual's cognitive processes but also in an individual's interaction with others. We identify three key points to be considered when developing systems that capture affective information: embodiment (experiencing physical reality), dynamics (mapping the experience and emotional state to a label), and adaptive interaction (conveying an emotive response and responding to a recognized emotional state). We present two computational systems that implement these principles: MOUE (Model Of User Emotions), an emotion recognition system that recognizes the user's emotion from his/her facial expressions and, using the user's feedback, adaptively builds semantic definitions of emotion concepts; and MIKE (Multimedia Interactive Environment for Kansei communication), an interactive adaptive system that, together with the user, co-evolves a language for communicating about subjective impressions.

Publication Date

2-11-2002

Publication Title

User Modeling and User-Adapted Interaction

Volume

12

Issue

1

Number of Pages

49-84

Document Type

Article

Personal Identifier

scopus

DOI Link

https://doi.org/10.1023/A:1013365332180

Scopus ID

0036152561 (Scopus)

Source API URL

https://api.elsevier.com/content/abstract/scopus_id/0036152561

