View invariant action recognition using projective depth

    Authors

    N. Ashraf; C. Sun; H. Foroosh

    Abbreviated Journal Title

    Comput. Vis. Image Underst.

    Keywords

    View invariance; Action recognition; Projective depth; SPACE; REPRESENTATION; MACHINE; FLOW; Computer Science, Artificial Intelligence; Engineering, Electrical & Electronic

    Abstract

    In this paper, we investigate the concept of projective depth and demonstrate its application and significance in view-invariant action recognition. We show that projective depths are invariant to camera internal parameters and orientation, and hence can be used to identify similar motions of body points across varying viewpoints. By representing the human body as a set of points, we decompose a body posture into a set of projective depths. The similarity between two actions is therefore measured by the motion of the projective depths. We exhaustively investigate different ways of extracting the planes used to estimate projective depths for action recognition, including (i) the ground plane, (ii) body-point triplets, (iii) planes in time, and (iv) planes extracted from mirror symmetry. We analyze these techniques and evaluate their efficacy for view-invariant action recognition. Experiments are performed on three categories of data: the CMU MoCap dataset, a Kinect dataset, and the IXMAS dataset. Results on semi-synthetic video data and real data confirm that our method can recognize actions even when they have dynamic timeline maps and the viewpoints and camera parameters are unknown and entirely different. (C) 2014 Elsevier Inc. All rights reserved.
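
    A brief sketch of the underlying geometry (one common formulation of projective depth; the paper's exact definition may differ): for a scene point $X$, a reference plane $\pi$, and a camera with center $C$, define

        \gamma(X; \pi, C) = \frac{d(X, \pi)}{d(X, C)},

    where $d(X, \pi)$ is the perpendicular distance from $X$ to $\pi$ and $d(X, C)$ is the Euclidean distance from $X$ to $C$. Both distances are purely scene quantities in which the calibration matrix $K$ and the camera rotation $R$ never appear, so $\gamma$ is unchanged by the camera's internal parameters and orientation (though not by its position). Tracking such ratios for each body point over time yields the kind of view-invariant motion signature the abstract describes.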

    Journal Title

    Computer Vision and Image Understanding

    Volume

    123

    Publication Date

    1-1-2014

    Document Type

    Article

    Language

    English

    First Page

    41

    Last Page

    52

    WOS Identifier

    WOS:000335488600004

    ISSN

    1077-3142
