Abstract
In this paper, a robust visual tracking method is proposed to track an object under dynamic conditions, including motion blur, illumination changes, pose variations, and occlusions. To cope with these challenges, multiple trackers with different feature descriptors are utilized, each of which shows a different level of robustness to certain changes in an object's appearance. To fuse these independent trackers, we propose two configurations: tracker selection and tracker interaction. Tracker interaction is achieved based on a transition probability matrix (TPM) in a probabilistic manner. Tracker selection extracts one tracking result from among the multiple tracker outputs by choosing the tracker with the highest tracker probability. According to various changes in the object's appearance, the TPM and tracker probability are updated in a recursive Bayesian form by evaluating each tracker's reliability, which is measured by a robust tracker likelihood function (TLF). When tracking in each frame is completed, the estimated object state is obtained and fed into the reference update via the proposed learning strategy, which retains the robustness and adaptability of the TLF and the multiple trackers. The experimental results demonstrate that the proposed method is robust in various benchmark scenarios.
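The fusion scheme described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names, the three-tracker setup, and the specific TPM and likelihood values are all assumptions; only the overall structure (propagate tracker probabilities through a TPM, re-weight by per-tracker likelihoods in a recursive Bayesian step, then select the most probable tracker) follows the text.

```python
import numpy as np

def update_tracker_probs(prior, tpm, likelihoods):
    """One recursive Bayesian update of tracker probabilities.

    prior       : current tracker probability vector, shape (N,)
    tpm         : transition probability matrix, tpm[i, j] = P(tracker j | tracker i)
    likelihoods : per-tracker reliability scores (stand-in for the TLF)
    """
    predicted = tpm.T @ prior             # interaction: mix priors through the TPM
    posterior = predicted * likelihoods   # correction: weight by tracker reliability
    return posterior / posterior.sum()    # normalize back to a probability vector

def select_tracker(probs):
    """Tracker selection: index of the tracker with the highest probability."""
    return int(np.argmax(probs))

# Illustrative example: three trackers, uniform prior, mildly "sticky" TPM.
prior = np.full(3, 1 / 3)
tpm = np.array([[0.8, 0.1, 0.1],
                [0.1, 0.8, 0.1],
                [0.1, 0.1, 0.8]])
likelihoods = np.array([0.2, 0.7, 0.1])   # hypothetical TLF scores this frame

posterior = update_tracker_probs(prior, tpm, likelihoods)
print(select_tracker(posterior))  # tracker 1: highest likelihood wins here
```

With a uniform prior and a symmetric TPM, the prediction step leaves the probabilities uniform, so the selected tracker is simply the one with the largest likelihood; with a peaked prior, the sticky TPM would instead favor keeping the previously dominant tracker.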