Multimodal Analysis for Identification and Segmentation of Moving-Sounding Objects
IEEE Trans. Multimedia
Audio-visual analysis; audio-visual synchronization; canonical correlation analysis; video segmentation; fusion; localization; tracking
In this paper, we propose a novel method that exploits the correlation between the audio and visual dynamics of a video to segment and localize the objects that are the dominant source of audio. Our approach consists of a two-step spatiotemporal segmentation mechanism that relies on the velocity and acceleration of moving objects as visual features. Each frame of the video is segmented into regions based on motion and appearance cues using the QuickShift algorithm; these regions are then clustered over time using K-means to obtain a spatiotemporal video segmentation. The video is represented by motion features computed over the individual segments. The Mel-Frequency Cepstral Coefficients (MFCC) of the audio signal and their first-order derivatives are used to represent the audio. The proposed framework assumes a non-trivial correlation between these audio features and the velocity and acceleration of moving, sounding objects. Canonical correlation analysis (CCA) is used to identify the moving objects that are most correlated with the audio signal. Beyond moving-sounding object identification, the same framework is also applied to the problem of audio-video synchronization and is used to aid interactive segmentation. We evaluate the performance of the proposed method on challenging videos. Our experiments demonstrate a significant improvement over the state-of-the-art, both qualitatively and quantitatively, and validate the feasibility and superiority of our approach.
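The core step of the abstract, scoring each motion segment by its canonical correlation with the audio features, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `canonical_correlations` routine, the regularization term, and the synthetic stand-ins for MFCC and per-segment motion features are all assumptions made for the sake of a runnable example.

```python
import numpy as np

def canonical_correlations(X, Y, reg=1e-6):
    """Canonical correlations between feature matrices X (n, dx) and Y (n, dy).

    Computed as the singular values of Sxx^{-1/2} Sxy Syy^{-1/2};
    a small ridge term keeps the covariance inverses stable.
    """
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = Xc.T @ Xc / n + reg * np.eye(Xc.shape[1])
    Syy = Yc.T @ Yc / n + reg * np.eye(Yc.shape[1])
    Sxy = Xc.T @ Yc / n

    def inv_sqrt(S):
        # symmetric inverse square root via eigendecomposition
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.clip(np.linalg.svd(M, compute_uv=False), 0.0, 1.0)

# Toy data: segment 0's motion follows the audio, segment 1 moves independently.
rng = np.random.default_rng(0)
T = 200
audio = np.column_stack([np.sin(np.linspace(0, 8 * np.pi, T)),
                         rng.standard_normal(T)])  # stand-in for MFCC + deltas
seg_motion = [
    0.9 * audio[:, :1] + 0.1 * rng.standard_normal((T, 1)),  # audio-driven
    rng.standard_normal((T, 1)),                             # unrelated motion
]
scores = [canonical_correlations(m, audio)[0] for m in seg_motion]
best = int(np.argmax(scores))  # index of the putative sounding segment
```

Ranking segments by their leading canonical correlation, as above, selects the audio-driven segment; the paper applies the same idea with real per-segment velocity/acceleration features against the MFCC stream.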
"Multimodal Analysis for Identification and Segmentation of Moving-Sounding Objects" (2013). Faculty Bibliography 2010s. 4143.