Abstract
We analyze walking people using a gait sequence representation that bypasses the need for frame-to-frame tracking of body parts. The gait representation maps a video sequence of silhouettes into a pair of two-dimensional spatio-temporal patterns that are near-periodic along the time axis. Mathematically, such patterns are called “frieze” patterns, and their associated symmetry groups are called “frieze groups”. With the help of a walking humanoid avatar, we explore how gait frieze patterns vary with viewing angle, and find that the frieze groups of the gait patterns and their canonical tiles enable us to estimate the viewing direction of human walking videos. In addition, analysis of periodic patterns allows us to determine the dynamic time warping and affine scaling that align two gait sequences from similar viewpoints. We also show how gait alignment can be used to perform human identification and model-based body part segmentation.