Abstract
We envision a future time when wearable cameras (e.g., small cameras in glasses or pinned on a shirt collar) are worn by the masses and record first-person point-of-view (POV) videos of everyday life. While these cameras can enable new assistive technologies and open novel research challenges, they also raise serious privacy concerns. For example, first-person videos passively recorded by wearable cameras will necessarily include anyone who comes into the camera's view, with or without consent. Motivated by these benefits and risks, we develop a self-search technique tailored to first-person POV videos. The key observation of our work is that the egocentric head motions of a target person (i.e., the self) are observed both in the target's own POV video and in the observer's video. The motion correlation between the target person's video and the observer's video can then be used to uniquely identify instances of the self. We incorporate this feature into our proposed approach, which computes the motion correlation over supervoxel hierarchies to localize target instances in observer videos. Our proposed approach significantly improves self-search performance over several well-known face detectors and recognizers. Furthermore, we show how our approach can enable several practical applications such as privacy filtering, automated video collection, and social group discovery.
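The correlation step can be pictured with a minimal sketch. The Python snippet below is an illustrative approximation, not the paper's implementation: it uses Farneback optical flow as a stand-in for the paper's motion features and a plain Pearson correlation in place of the full supervoxel-hierarchy formulation, and all function and variable names are hypothetical.

```python
# Sketch: score how well the motion inside a candidate region of the
# observer's video correlates with the target's ego-motion signal.
# Assumptions (not from the paper): Farneback flow as the motion feature,
# Pearson correlation as the matching score.
import cv2
import numpy as np


def global_motion_signal(frames):
    """Per-frame mean flow magnitude of a POV video (ego-motion proxy)."""
    signal = []
    prev = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    for frame in frames[1:]:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        signal.append(np.linalg.norm(flow, axis=2).mean())
        prev = gray
    return np.asarray(signal)


def region_motion_signal(frames, masks):
    """Per-frame flow magnitude inside one candidate region of the
    observer's video (e.g., a supervoxel tracked over time); `masks`
    holds one boolean mask per frame."""
    signal = []
    prev = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    for frame, mask in zip(frames[1:], masks[1:]):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mag = np.linalg.norm(flow, axis=2)
        signal.append(mag[mask].mean() if mask.any() else 0.0)
        prev = gray
    return np.asarray(signal)


def self_score(target_signal, region_signal):
    """Pearson correlation of the two motion signals; a high score
    suggests the region depicts the target person (the self)."""
    a = (target_signal - target_signal.mean()) / (target_signal.std() + 1e-8)
    b = (region_signal - region_signal.mean()) / (region_signal.std() + 1e-8)
    return float(np.dot(a, b) / len(a))
```

In the full approach, such scores would be computed over an entire supervoxel hierarchy rather than a single hand-picked region, so that target instances can be localized at the best-matching spatial scale.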