Title
Complex Events Detection Using Data-Driven Concepts
Abstract
Automatic event detection in a large collection of unconstrained videos is a challenging and important task. The key issue is to describe long, complex videos with high-level semantic descriptors that capture the regularity of events within the same category while distinguishing them from events of different categories. This paper proposes a novel unsupervised approach that discovers data-driven concepts from multi-modality signals (audio, scene, and motion) to describe the high-level semantics of videos. Our method consists of three main components: first, we learn low-level features separately from the three modalities; second, we discover data-driven concepts based on the statistics of the learned features mapped to a low-dimensional space using deep belief nets (DBNs); finally, a compact and robust sparse representation is learned to jointly model the concepts from all three modalities. Extensive experimental results on a large in-the-wild dataset show that our proposed method significantly outperforms state-of-the-art methods. © 2012 Springer-Verlag.
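The three-stage pipeline summarized above can be illustrated with a minimal sketch, assuming scikit-learn stand-ins rather than the authors' actual implementation: stacked BernoulliRBMs approximate the DBN mapping, k-means clustering serves as the data-driven concept discovery step, and DictionaryLearning provides the joint sparse representation. The feature matrices, segment counts, and layer sizes below are hypothetical placeholders, not the paper's audio/scene/motion descriptors.

```python
# Sketch of the abstract's pipeline: per-modality features -> DBN-style mapping
# -> concept discovery -> joint sparse representation. All data is synthetic.
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.cluster import KMeans
from sklearn.decomposition import DictionaryLearning
from sklearn.pipeline import Pipeline

rng = np.random.RandomState(0)
n_videos, n_segments = 50, 20          # hypothetical video/segment counts

def dbn_map(features, dims=(256, 64)):
    """Map low-level features to a low-dimensional space with stacked RBMs."""
    rbms = [(f"rbm{i}", BernoulliRBM(n_components=d, learning_rate=0.05,
                                     n_iter=10, random_state=0))
            for i, d in enumerate(dims)]
    return Pipeline(rbms).fit_transform(features)

def concept_histograms(low_dim, n_concepts=32):
    """Cluster segment-level codes into concepts; one histogram per video."""
    labels = KMeans(n_clusters=n_concepts, n_init=10,
                    random_state=0).fit_predict(low_dim)
    labels = labels.reshape(n_videos, n_segments)
    hists = np.stack([np.bincount(row, minlength=n_concepts) for row in labels])
    return hists / hists.sum(axis=1, keepdims=True)

# Stage 1: placeholder low-level features for the three modalities
modalities = {m: rng.rand(n_videos * n_segments, 512)
              for m in ("audio", "scene", "motion")}

# Stage 2: DBN-style mapping plus concept discovery, per modality
per_modality = [concept_histograms(dbn_map(x)) for x in modalities.values()]

# Stage 3: joint sparse representation over concatenated concept histograms
joint = np.hstack(per_modality)
sparse_codes = DictionaryLearning(n_components=48, alpha=1.0,
                                  transform_algorithm="lasso_lars",
                                  random_state=0).fit_transform(joint)
print(sparse_codes.shape)  # (n_videos, 48): one sparse code per video
```

The resulting per-video sparse codes would then feed a downstream event classifier; the specific layer sizes, number of concepts, and dictionary size here are illustrative choices only.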
Publication Date
10-30-2012
Publication Title
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume
7574 LNCS
Issue
PART 3
Pages
722-735
Document Type
Article; Proceedings Paper
Personal Identifier
scopus
DOI Link
https://doi.org/10.1007/978-3-642-33712-3_52
Copyright Status
Unknown
Scopus ID
84867886443 (Scopus)
Source API URL
https://api.elsevier.com/content/abstract/scopus_id/84867886443
STARS Citation
Yang, Yang and Shah, Mubarak, "Complex Events Detection Using Data-Driven Concepts" (2012). Scopus Export 2010-2014. 4686.
https://stars.library.ucf.edu/scopus2010/4686