MAPTrack: A probabilistic real time tracking framework by integrating motion, appearance and position models
Document Type
Conference Proceeding
Publication Date
1-1-2015
Abstract
In this paper, we present MAPTrack - a robust tracking framework that uses a probabilistic scheme to combine a motion model of an object with models of its appearance and an estimate of its position. The motion of the object is modelled using the Gaussian Mixture Background Subtraction algorithm, the appearance of the tracked object is represented by a color histogram, and the projected location of the tracked object in the image space/frame sequence is computed by applying a Gaussian to the Region of Interest. Our tracking framework is robust to abrupt changes in lighting conditions, can follow an object through occlusions, and can simultaneously track multiple moving foreground objects of different types (e.g., vehicles, humans) even when they are closely spaced. It is able to start tracks automatically based on a spatio-temporal filtering algorithm. A "dynamic" integration of the framework with optical flow allows us to track objects in videos with significant camera motion. A C++ implementation of the framework has outperformed existing visual tracking algorithms on most videos in the Video Image Retrieval and Analysis Tool (VIRAT), TUD, and Tracking-Learning-Detection (TLD) datasets.
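To make the cue-combination idea in the abstract concrete, the sketch below shows one minimal way to fuse a motion cue (OpenCV's MOG2 background subtractor as a stand-in for Gaussian Mixture Background Subtraction), an appearance cue (a hue histogram compared via the Bhattacharyya measure), and a position cue (an isotropic Gaussian around a predicted location) into a single per-candidate score. This is a loose illustration, not the authors' implementation: all function names, the candidate ROI, and parameters such as sigma are assumptions made for the example.

```cpp
// Hedged sketch of combining motion, appearance, and position cues into one
// per-candidate probability, in the spirit of the abstract. Names and
// parameters are illustrative, not taken from the MAPTrack codebase.
#include <opencv2/opencv.hpp>
#include <cmath>

// Motion cue: fraction of foreground pixels (MOG2 mask) inside the candidate ROI.
static double motionScore(const cv::Mat& fgMask, const cv::Rect& roi) {
    cv::Mat patch = fgMask(roi & cv::Rect(0, 0, fgMask.cols, fgMask.rows));
    return patch.empty() ? 0.0 : cv::countNonZero(patch) / double(patch.total());
}

// Appearance cue: similarity between the candidate's hue histogram and the
// reference histogram of the tracked object (1 = identical, 0 = very different).
static double appearanceScore(const cv::Mat& hsvFrame, const cv::Rect& roi,
                              const cv::Mat& refHist) {
    cv::Mat patch = hsvFrame(roi), hist;
    int channels[] = {0}; int histSize[] = {30};
    float hueRange[] = {0, 180}; const float* ranges[] = {hueRange};
    cv::calcHist(&patch, 1, channels, cv::Mat(), hist, 1, histSize, ranges);
    cv::normalize(hist, hist, 1.0, 0.0, cv::NORM_L1);
    return 1.0 - cv::compareHist(refHist, hist, cv::HISTCMP_BHATTACHARYYA);
}

// Position cue: isotropic Gaussian centred on the predicted object location.
static double positionScore(const cv::Point2f& candidate,
                            const cv::Point2f& predicted, double sigma) {
    double dx = candidate.x - predicted.x, dy = candidate.y - predicted.y;
    return std::exp(-(dx * dx + dy * dy) / (2.0 * sigma * sigma));
}

int main() {
    cv::VideoCapture cap(0);                        // any video source
    auto mog2 = cv::createBackgroundSubtractorMOG2();
    cv::Mat frame, fgMask, hsv, refHist;            // refHist set when a track starts
    cv::Point2f predicted(320.f, 240.f);            // e.g., carried over from the previous frame
    const double sigma = 25.0;                      // assumed position uncertainty (pixels)

    while (cap.read(frame)) {
        mog2->apply(frame, fgMask);                 // motion model update
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);

        cv::Rect candidate(300, 220, 40, 40);       // one hypothesised ROI
        double p = motionScore(fgMask, candidate);
        if (!refHist.empty())
            p *= appearanceScore(hsv, candidate, refHist);
        p *= positionScore(cv::Point2f(candidate.x + candidate.width / 2.0f,
                                       candidate.y + candidate.height / 2.0f),
                           predicted, sigma);
        // p is the combined (unnormalised) score for this candidate; a full
        // tracker would evaluate many candidates per object and per frame.
    }
    return 0;
}
```

In a real multi-object setting, the combined score would be evaluated over a set of candidate regions per tracked object, with the reference histogram and predicted position updated from the winning candidate each frame.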
Publication Source (Journal or Book title)
VISAPP 2015 - 10th International Conference on Computer Vision Theory and Applications (VISIGRAPP Proceedings)
First Page
567
Last Page
574
Recommended Citation
Basu, S., Karki, M., Stagg, M., DiBiano, R., Ganguly, S., & Mukhopadhyay, S. (2015). MAPTrack: A probabilistic real time tracking framework by integrating motion, appearance and position models. VISAPP 2015 - 10th International Conference on Computer Vision Theory and Applications (VISIGRAPP Proceedings), 3, 567-574. https://doi.org/10.5220/0005309805670574