Abstract
This paper addresses the problem of probabilistically modeling 3D human motion for synthesis and tracking. Given the high dimensional nature of human motion, learning an explicit probabilistic model from available training data is currently impractical. Instead we exploit methods from texture synthesis that treat images as representing an implicit empirical distribution. These methods replace the problem of representing the probability of a texture pattern with that of searching the training data for similar instances of that pattern. We extend this idea to temporal data representing 3D human motion with a large database of example motions. To make the method useful in practice, we must address the problem of efficient search in a large training set; efficiency is particularly important for tracking. Towards that end, we learn a low dimensional linear model of human motion that is used to structure the example motion database into a binary tree. An approximate probabilistic tree search method exploits the coefficients of this low-dimensional representation and runs in sub-linear time. This probabilistic tree search returns a particular sample human motion with probability approximating the true distribution of human motions in the database. This sampling method is suitable for use with particle filtering techniques and is applied to articulated 3D tracking of humans within a Bayesian framework. Successful tracking results are presented, along with examples of synthesizing human motion using the model.
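The binary-tree sampling idea described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes each example motion is summarized by a single low-dimensional coefficient (e.g. its leading PCA coefficient), splits the database on the median coefficient, and draws a sample by a probabilistic descent that favors, but does not always choose, the child nearer the query. The names `build_tree`, `sample`, and the sigmoid branching rule are hypothetical.

```python
import math
import random

def build_tree(examples):
    """Recursively split a list of (coefficient, motion) pairs on the
    median coefficient; leaves hold single examples. Internal nodes are
    (split_value, left_subtree, right_subtree) triples."""
    if len(examples) == 1:
        return examples[0]
    examples = sorted(examples, key=lambda e: e[0])
    mid = len(examples) // 2
    return (examples[mid][0], build_tree(examples[:mid]), build_tree(examples[mid:]))

def sample(tree, query, temperature=1.0):
    """Descend the tree, branching left or right stochastically based on
    the query's signed distance to the split value; the nearer side is
    more likely but either can be taken, so repeated calls approximate a
    distribution over stored motions. Runs in O(log n) per draw."""
    while len(tree) == 3:  # internal node (leaves are length-2 pairs)
        split, left, right = tree
        # Sigmoid branching rule (an assumed form, for illustration):
        # queries below the split prefer the left (smaller) side.
        p_left = 1.0 / (1.0 + math.exp((query - split) / temperature))
        tree = left if random.random() < p_left else right
    return tree
```

Repeated draws with the same query return nearby motions most often but occasionally more distant ones, which is the property that makes such a sampler usable inside a particle filter.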