Modeling multimodal expression of user's affective subjective experience

    Authors

    N. Bianchi-Berthouze; C. L. Lisetti

    Abbreviated Journal Title

    User Model. User-Adapt. Interact.

    Keywords

    affect; embodiment; emotion; interaction; perception; subjective experience; facial expressions; emotions; images; Computer Science, Cybernetics

    Abstract

    With the growing importance of information technology in everyday life, new types of applications are emerging that require understanding information in a broad sense. Information that includes affective and subjective content plays a major role not only in an individual's cognitive processes but also in an individual's interactions with others. We identify three key points to consider when developing systems that capture affective information: embodiment (experiencing physical reality), dynamics (mapping an experience and emotional state to its label), and adaptive interaction (conveying an emotive response and responding to a recognized emotional state). We present two computational systems that implement these principles: MOUE (Model Of User Emotions), an emotion recognition system that recognizes the user's emotion from his or her facial expressions and, using the user's feedback, adaptively builds semantic definitions of emotion concepts; and MIKE (Multimedia Interactive Environment for Kansei communication), an interactive adaptive system that, together with the user, co-evolves a language for communicating about subjective impressions.

    Journal Title

    User Modeling and User-Adapted Interaction

    Volume

    12

    Issue/Number

    1

    Publication Date

    1-1-2002

    Document Type

    Article

    Language

    English

    First Page

    49

    Last Page

    84

    WOS Identifier

    WOS:000172862500002

    ISSN

    0924-1868
