Title
Discovering Motion Primitives For Unsupervised Grouping And One-Shot Learning Of Human Actions, Gestures, And Expressions
Keywords
action recognition; action representation; facial expressions; gestures; hidden Markov model; histogram of motion primitives; human actions; motion patterns; motion primitives; motion primitive strings; one-shot learning; unsupervised clustering
Abstract
This paper proposes a novel representation of articulated human actions, gestures, and facial expressions. The main goals of the proposed approach are: 1) to enable recognition using very few examples, i.e., one- or k-shot learning, and 2) to meaningfully organize unlabeled datasets by unsupervised clustering. The proposed representation is obtained by automatically discovering high-level subactions, or motion primitives, through hierarchical clustering of observed optical flow in a four-dimensional space of spatial location and motion flow. The method is completely unsupervised and, in contrast to state-of-the-art representations such as bag of video words, provides a meaningful representation conducive to visual interpretation and textual labeling. Each primitive depicts an atomic subaction, such as directional motion of a limb or the torso, and is represented by a mixture of four-dimensional Gaussian distributions. For one-shot and k-shot learning, the primitives discovered in a test video are labeled using KL divergence; the resulting sequence can then be represented as a string and matched against similar strings from training videos. The same sequence can also be collapsed into a histogram of primitives or used to learn a hidden Markov model representing each class. We have performed extensive experiments on recognition by one- and k-shot learning, as well as unsupervised action clustering, on six human action and gesture datasets, a composite dataset, and a database of facial expressions. These experiments confirm the validity and discriminative power of the proposed representation.
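The following is a minimal sketch, not the authors' implementation, of two ideas from the abstract: modeling each motion primitive as a mixture of four-dimensional Gaussians over (x, y, u, v) samples (pixel location plus optical flow), and labeling a test segment by the primitive with the smallest approximate KL divergence. All data, component counts, and helper names here are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture


def fit_primitive(flow_samples, n_components=3, seed=0):
    """Fit a 4-D GMM to (x, y, u, v) optical-flow samples of one primitive."""
    gmm = GaussianMixture(n_components=n_components,
                          covariance_type="full", random_state=seed)
    gmm.fit(flow_samples)
    return gmm


def approx_kl(gmm_p, gmm_q, n_samples=2000):
    """Monte-Carlo estimate of KL(p || q) between two GMMs (no closed form)."""
    x, _ = gmm_p.sample(n_samples)
    return float(np.mean(gmm_p.score_samples(x) - gmm_q.score_samples(x)))


def label_segment(segment_samples, primitives):
    """Label a test segment with the index of the minimum-KL primitive model."""
    seg_gmm = fit_primitive(segment_samples)
    return int(np.argmin([approx_kl(seg_gmm, p) for p in primitives]))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical primitives: two clusters of 4-D (x, y, u, v) samples.
    prim_a = fit_primitive(rng.normal([10, 10, 1, 0], 0.5, size=(500, 4)))
    prim_b = fit_primitive(rng.normal([40, 40, 0, -1], 0.5, size=(500, 4)))
    # A test segment resembling primitive A; labeling successive segments
    # would yield the string / histogram representation described above.
    test = rng.normal([10, 10, 1, 0], 0.6, size=(300, 4))
    print("label:", label_segment(test, [prim_a, prim_b]))
```

Labeling every temporal segment of a video this way produces the sequence of primitive labels that the paper then treats as a string, a histogram, or an HMM observation sequence.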
Publication Date
5-29-2013
Publication Title
IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume
35
Issue
7
Pages
1635-1648
Document Type
Article
Personal Identifier
scopus
DOI Link
https://doi.org/10.1109/TPAMI.2012.253
Copyright Status
Unknown
Socpus ID
84878144777 (Scopus)
Source API URL
https://api.elsevier.com/content/abstract/scopus_id/84878144777
STARS Citation
Yang, Yang; Saleemi, Imran; and Shah, Mubarak, "Discovering Motion Primitives For Unsupervised Grouping And One-Shot Learning Of Human Actions, Gestures, And Expressions" (2013). Scopus Export 2010-2014. 6995.
https://stars.library.ucf.edu/scopus2010/6995