Title
Detection and representation of scenes in videos
Abbreviated Journal Title
IEEE Trans. Multimedia
Keywords
graph partitioning; key-frames; normalized cuts; scene; shot; video; segmentation; IMAGE SEGMENTATION; Computer Science, Information Systems; Computer Science, Software Engineering; Telecommunications
Abstract
This paper presents a method to perform a high-level segmentation of videos into scenes. A scene can be defined as a subdivision of a play in which either the setting is fixed or the action is continuous in one place. Exploiting this definition, we propose a novel approach for clustering shots into scenes by transforming the task into a graph partitioning problem. This is achieved by constructing a weighted undirected graph called a shot similarity graph (SSG), where each node represents a shot and the edges between shots are weighted by their similarity based on color and motion information. The SSG is then split into subgraphs by applying the normalized cuts criterion for graph partitioning. The partitions so obtained represent individual scenes in the video. When clustering the shots, we consider the global similarity of shots rather than individual shot pairs. We also propose a method to describe the content of each scene by selecting one representative image from the video as a scene key-frame. Recently, DVDs have become available with a chapter selection option where each chapter is represented by one image. Our algorithm automates this task, which is useful for applications such as video-on-demand, digital libraries, and the Internet. Experiments are presented with promising results on several Hollywood movies and one sitcom.
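The abstract outlines two computational steps: building a shot similarity graph from per-shot features and recursively splitting it with the normalized cut criterion. The Python sketch below illustrates only those two steps and is not the authors' implementation; the Gaussian similarity over generic shot feature vectors, the median-based split point, and the stopping threshold are illustrative assumptions (the paper combines color and motion information and uses its own parameters).

# Minimal sketch of SSG construction and recursive normalized-cut
# partitioning. Feature choice and all thresholds are assumptions.
import numpy as np

def shot_similarity_graph(shot_features):
    """Build a symmetric affinity (SSG) matrix from one feature vector per shot.

    shot_features: (n_shots, d) array, e.g. a color histogram per shot
    (the paper also incorporates motion; omitted here).
    """
    f = np.asarray(shot_features, dtype=float)
    # Pairwise squared Euclidean distances between shot features.
    d2 = ((f[:, None, :] - f[None, :, :]) ** 2).sum(-1)
    sigma = np.median(d2) + 1e-12          # illustrative bandwidth choice
    w = np.exp(-d2 / sigma)                # Gaussian similarity
    np.fill_diagonal(w, 0.0)
    return w

def normalized_cut_labels(w, ncut_threshold=0.6, min_size=2):
    """Recursively bipartition the SSG; returns one scene label per shot."""
    n = w.shape[0]
    labels = np.zeros(n, dtype=int)
    _split(w, np.arange(n), labels, ncut_threshold, min_size, next_label=[1])
    return labels

def _split(w, idx, labels, thr, min_size, next_label):
    if idx.size < 2 * min_size:
        return
    sub = w[np.ix_(idx, idx)]
    d = sub.sum(1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    # Symmetric normalized Laplacian; its second-smallest eigenvector
    # approximates the minimum normalized cut (Shi & Malik).
    lap = np.eye(idx.size) - d_inv_sqrt[:, None] * sub * d_inv_sqrt[None, :]
    _, vecs = np.linalg.eigh(lap)
    fiedler = d_inv_sqrt * vecs[:, 1]
    mask = fiedler >= np.median(fiedler)   # simple split point
    a, b = idx[mask], idx[~mask]
    if a.size < min_size or b.size < min_size:
        return
    # Ncut(A, B) = cut(A, B)/assoc(A, V) + cut(A, B)/assoc(B, V)
    cut = w[np.ix_(a, b)].sum()
    ncut = cut / w[np.ix_(a, idx)].sum() + cut / w[np.ix_(b, idx)].sum()
    if ncut > thr:                         # splitting no longer worthwhile
        return
    labels[b] = next_label[0]
    next_label[0] += 1
    _split(w, a, labels, thr, min_size, next_label)
    _split(w, b, labels, thr, min_size, next_label)

For the representation step described in the abstract, one possibility is to pick, within each resulting scene, the shot key-frame whose feature vector is closest to the scene's mean feature vector; the paper's own selection criterion may differ.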
Journal Title
IEEE Transactions on Multimedia
Volume
7
Issue/Number
6
Publication Date
1-1-2005
Document Type
Article
Language
English
First Page
1097
Last Page
1105
WOS Identifier
ISSN
1520-9210
Recommended Citation
"Detection and representation of scenes in videos" (2005). Faculty Bibliography 2000s. 5574.
https://stars.library.ucf.edu/facultybib2000/5574