Motion tracking is a well-defined yet application-specific problem in computer vision, typically subject to real-time constraints. Methods addressing it are also expected to achieve high accuracy and robustness. This paper proposes a probabilistic estimation-based approach to improve the accuracy of real-time motion tracking with an RGB-Depth device. To this end, a novel method is presented for tracking the palm of a moving human subject under a set of assumptions: an indoor environment, a single object, smooth movement, and stable illumination. Tracking accuracy is improved within a particle filter framework by fusing the device output with information newly extracted from the RGB and depth images. Experimental results reveal the advantages of the proposed method over the device's built-in algorithms: it produces smaller RMSE values in both single runs and multi-execution trials without violating real-time constraints.
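The abstract describes fusing two measurement sources inside a particle filter. As a rough illustration only (the paper's actual motion and measurement models are not given here), the fusion step can be sketched as a standard predict-update-resample cycle in which each particle is weighted by the product of Gaussian likelihoods of both sources; all function names, noise parameters, and the random-walk motion model below are assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, z_device, z_image,
                         motion_std=0.02, dev_std=0.05, img_std=0.03):
    """One predict-update-resample cycle of a data-fusion particle filter.

    particles : (N, D) array of state hypotheses (e.g. 3-D palm position)
    z_device  : measurement from the device's built-in tracker
    z_image   : measurement extracted from the RGB/depth images
    (All noise magnitudes are illustrative placeholders.)
    """
    n = len(particles)
    # Predict: random-walk motion model (smooth-movement assumption)
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: fuse both sources by multiplying their Gaussian likelihoods
    d_dev = np.linalg.norm(particles - z_device, axis=1)
    d_img = np.linalg.norm(particles - z_image, axis=1)
    weights = np.exp(-0.5 * (d_dev / dev_std) ** 2) \
            * np.exp(-0.5 * (d_img / img_std) ** 2)
    weights /= weights.sum()
    # Point estimate: weighted mean of the particle cloud
    estimate = weights @ particles
    # Systematic resampling to avoid weight degeneracy
    cs = np.cumsum(weights)
    cs[-1] = 1.0  # guard against floating-point shortfall
    idx = np.searchsorted(cs, (rng.random() + np.arange(n)) / n)
    particles = particles[idx]
    weights = np.full(n, 1.0 / n)
    return particles, weights, estimate
```

Weighting each particle by the product of the two likelihoods is equivalent to treating the device and image measurements as conditionally independent given the true state, which is the usual assumption when fusing heterogeneous sensors in a particle filter.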
Motion tracking, data-fusion, particle filter
Taşcı, Tuğrul and Çelebi, Numan,
"Real-time motion tracking enhancement via data-fusion based particle filter,"
Turkish Journal of Electrical Engineering and Computer Sciences: Vol. 29, Iss. 5, Article 14.
Available at: https://journals.tubitak.gov.tr/elektrik/vol29/iss5/14