Title
Tracking Across Multiple Cameras With Disjoint Views
Abstract
Conventional tracking approaches assume proximity in space, time, and appearance of objects in successive observations. However, observations of objects are often widely separated in time and space when viewed from multiple non-overlapping cameras. To address this problem, we present a novel approach for establishing object correspondence across non-overlapping cameras. Our multi-camera tracking algorithm exploits the redundancy in paths that people and cars tend to follow, e.g. roads, walkways or corridors, by using motion trends and object appearance to establish correspondence. Our system does not require any inter-camera calibration; instead, the system learns the camera topology and path probabilities of objects using Parzen windows during a training phase. Once training is complete, correspondences are assigned using the maximum a posteriori (MAP) estimation framework. The learned parameters are updated with changing trajectory patterns. Experiments with real-world videos are reported, which validate the proposed approach.
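The abstract's core idea (a Parzen-window density learned over inter-camera transitions during training, then MAP assignment of correspondences) can be illustrated with a minimal sketch. All names, feature choices, bandwidths, and priors below are illustrative assumptions, not the paper's actual formulation, which also incorporates appearance and richer spatio-temporal features.

```python
import math

def parzen_density(x, samples, h=1.0):
    """Gaussian-kernel Parzen-window estimate of p(x) from training samples."""
    n = len(samples)
    return sum(
        math.exp(-0.5 * ((x - s) / h) ** 2) / (h * math.sqrt(2 * math.pi))
        for s in samples
    ) / n

# Hypothetical training-phase transition times (seconds) observed for
# objects moving from camera 1's exit to camera 2's entry.
train_times = [4.0, 4.5, 5.0, 5.5, 6.0]

def map_correspondence(entry_time, candidates):
    """Pick the candidate exit maximizing p(transition time | match) * prior.
    Appearance likelihood terms are omitted here for brevity."""
    best_id, best_post = None, -1.0
    for cand_id, exit_time, prior in candidates:
        post = parzen_density(entry_time - exit_time, train_times) * prior
        if post > best_post:
            best_id, best_post = cand_id, post
    return best_id

# Two hypothetical exits from camera 1; an object enters camera 2 at t=15.
candidates = [("A", 10.0, 0.5), ("B", 14.0, 0.5)]
print(map_correspondence(15.0, candidates))  # "A": its ~5 s gap fits the model
```

In the paper's framework the kernel density would be estimated jointly over spatial and temporal transition features, and the MAP assignment would combine this with an appearance similarity term rather than the transition-time likelihood alone.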
Publication Date
1-1-2003
Publication Title
Proceedings of the IEEE International Conference on Computer Vision
Volume
2
Pages
952-957
Document Type
Article; Proceedings Paper
Personal Identifier
scopus
DOI Link
https://doi.org/10.1109/iccv.2003.1238451
Copyright Status
Unknown
Scopus ID
0344983311 (Scopus)
Source API URL
https://api.elsevier.com/content/abstract/scopus_id/0344983311
STARS Citation
Javed, Omar; Rasheed, Zeeshan; and Shafique, Khurram, "Tracking Across Multiple Cameras With Disjoint Views" (2003). Scopus Export 2000s. 2082.
https://stars.library.ucf.edu/scopus2000/2082