A paradigm shift in interactive computing: Deriving multimodal design principles from behavioral and neurological foundations

Authors

    K. Stanney; S. Samman; L. Reeves; K. Hale; W. Buff; C. Bowers; B. Goldiez; D. Nicholson; S. Lackey

    Abbreviated Journal Title

    Int. J. Hum.-Comput. Interact.

    Keywords

    ENDOGENOUS SPATIAL ATTENTION; AUDITORY LOCALIZATION CUES; AIDED; VISUAL-SEARCH; CROSS-MODAL LINKS; WORKING-MEMORY; SUPERIOR COLLICULUS; HUMAN BRAIN; VISION; INTEGRATION; PERCEPTION; Computer Science, Cybernetics; Ergonomics

    Abstract

    As technology advances, systems are increasingly able to provide more information than a human operator can process accurately. The challenge for designers is therefore to create interfaces that deliver the optimal amount of data to operators. It is herein proposed that this may be accomplished by creating multimodal display systems that augment or switch modalities to maximize user information processing. Such a system would ultimately be informed by a user's neurophysiological state. As a first step toward that goal, the relevant literature is reviewed and a set of preliminary design guidelines for multimodal information systems is proposed.

    Journal Title

    International Journal of Human-Computer Interaction

    Volume

    17

    Issue/Number

    2

    Publication Date

    1-1-2004

    Document Type

    Review

    Language

    English

    First Page

    229

    Last Page

    257

    WOS Identifier

    WOS:000222884900007

    ISSN

    1044-7318