Abstract
Long-term modeling of background motion in videos is an important and challenging problem with numerous applications, such as segmentation and event recognition. A major challenge in modeling the background from point trajectories lies in handling their variable durations, which arise from factors such as trajectories entering and leaving the frame or occlusions between different depth layers. This work proposes an online method for background modeling of dynamic point trajectories by tracking a linear subspace that describes the background motion. To cope with variable trajectory durations, we cast subspace tracking as an instance of subspace estimation under missing data, using a least-absolute-deviations formulation to robustly estimate the background in the presence of arbitrary foreground motion. Relative to previous work, our approach is very fast and scales to arbitrarily long videos, since it processes new frames sequentially as they arrive.
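The core idea can be illustrated with a minimal sketch of online subspace tracking under missing data. The code below is a simplified, hypothetical implementation (not the paper's exact algorithm): each incoming trajectory vector is observed only on a subset of entries, its subspace coefficients are fit by iteratively reweighted least squares (IRLS) to approximate a least-absolute-deviations fit so that sparse foreground outliers have limited influence, and the basis is then nudged by a GROUSE-style rank-one gradient step. All function and parameter names here are assumptions for illustration.

```python
import numpy as np

def track_subspace(U, y, omega, step=0.3, irls_iters=5, eps=1e-6):
    """One online update of an orthonormal basis U (d x k) from a
    trajectory vector y observed only on the index set omega.

    This is an illustrative sketch: coefficients are fit by IRLS to
    approximate an L1 (least-absolute-deviations) objective, then the
    basis takes a rank-one gradient step on the observed residual.
    """
    k = U.shape[1]
    U_o = U[omega]                 # basis rows at observed entries
    y_o = y[omega]                 # observed trajectory entries

    # Initialize coefficients with an ordinary least-squares fit.
    w = np.linalg.lstsq(U_o, y_o, rcond=None)[0]

    # IRLS: reweight by 1/|residual| to approximate the L1 fit,
    # down-weighting entries corrupted by foreground motion.
    for _ in range(irls_iters):
        r = y_o - U_o @ w
        d = 1.0 / np.maximum(np.abs(r), eps)
        A = U_o.T @ (U_o * d[:, None]) + eps * np.eye(k)
        b = U_o.T @ (y_o * d)
        w = np.linalg.solve(A, b)

    # Rank-one gradient step: only observed rows of U are moved.
    r_full = np.zeros_like(y)
    r_full[omega] = y_o - U_o @ w
    U = U + step * np.outer(r_full, w) / (w @ w + eps)

    # Re-orthonormalize so U stays a valid subspace basis.
    Q, _ = np.linalg.qr(U)
    return Q
```

Streaming partially observed background trajectories through `track_subspace` drives the estimated basis toward the true background subspace; because each update touches only the observed rows and costs O(d k^2), the method runs online over arbitrarily long sequences.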