Abstract
A novel memory-based particle filter is proposed to achieve robust visual tracking of a target's pose even under large variations in the target's position and rotation, i.e., large appearance changes. The memory-based particle filter (M-PF) is a recent extension of the particle filter that incorporates a memory-based mechanism to predict the prior distribution from the stored history of target states; it offers robust tracking against complex motion. This paper extends the M-PF to a unified probabilistic framework for joint estimation of the target's pose and appearance based on memory-based joint prior prediction using stored past pose and appearance sequences. We call it the Memory-based Particle Filter with Appearance Prediction (M-PFAP). The memory-based approach generates the joint prior distribution of pose and appearance without explicitly modeling the complex relationship between them. M-PFAP robustly handles large appearance changes caused by large pose variation, as well as abrupt changes in moving direction, and it enables robust tracking under self- and mutual occlusion. Experiments confirm that M-PFAP successfully tracks human faces from frontal view to profile view, greatly easing the limitations of M-PF.