Abstract
Scene parsing has attracted a lot of attention in computer vision. While parametric models have proven effective for this task, they cannot easily incorporate new training data. By contrast, nonparametric approaches, which bypass any learning phase and directly transfer the labels from the training data to the query images, can readily exploit new labeled samples as they become available. Unfortunately, because of the computational cost of their label transfer procedures, state-of-the-art nonparametric methods typically filter out most training images to only keep a few relevant ones to label the query. As such, these methods throw away many images that still contain valuable information and generally obtain an unbalanced set of labeled samples. In this paper, we introduce a nonparametric approach to scene parsing that follows a sample-and-filter strategy. More specifically, we propose to sample labeled superpixels according to an image similarity score, which allows us to obtain a balanced set of samples. We then formulate label transfer as an efficient filtering procedure, which lets us exploit more labeled samples than existing techniques. Our experiments evidence the benefits of our approach over state-of-the-art nonparametric methods on two benchmark datasets.
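The sampling step described above — drawing a class-balanced set of labeled superpixels, weighted by an image-level similarity score — could be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name, the input format (a list of records with `label` and `similarity` fields), and the per-class budget are all assumptions.

```python
import random
from collections import defaultdict

def sample_balanced_superpixels(superpixels, per_class=100, seed=0):
    """Class-balanced sampling of labeled superpixels.

    Within each class, superpixels are drawn without replacement with
    probability proportional to the similarity between their source
    image and the query image (hypothetical input format).
    """
    rng = random.Random(seed)

    # Group the candidate superpixels by their ground-truth class.
    by_class = defaultdict(list)
    for sp in superpixels:
        by_class[sp['label']].append(sp)

    sampled = []
    for label, pool in by_class.items():
        k = min(per_class, len(pool))
        pool = pool[:]
        weights = [sp['similarity'] for sp in pool]
        # Weighted draw without replacement, favoring superpixels from
        # images similar to the query, capped at `per_class` per class.
        for _ in range(k):
            total = sum(weights)
            r = rng.random() * total
            acc = 0.0
            for i, w in enumerate(weights):
                acc += w
                if acc >= r:
                    break
            sampled.append(pool.pop(i))
            weights.pop(i)
    return sampled
```

Capping the number of samples per class is what keeps the resulting set balanced even when some classes dominate the retrieved images.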