Abstract
PatchMatch is a fast algorithm for computing dense approximate nearest neighbor correspondences between patches of two image regions [1]. This paper generalizes PatchMatch in three ways: (1) to find k nearest neighbors, as opposed to just one, (2) to search across scales and rotations, in addition to just translations, and (3) to match using arbitrary descriptors and distances, not just sum-of-squared-differences on patch colors. In addition, we offer new search and parallelization strategies that further accelerate the method, and we show performance improvements over standard kd-tree techniques across a variety of inputs. In contrast to many previous matching algorithms, which for efficiency reasons have restricted matching to sparse interest points, or spatially proximate matches, our algorithm can efficiently find global, dense matches, even while matching across all scales and rotations. This is especially useful for computer vision applications, where our algorithm can be used as an efficient general-purpose component. We explore a variety of vision applications: denoising, finding forgeries by detecting cloned regions, symmetry detection, and object detection.
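To make the baseline concrete, the following is a minimal sketch of the original translation-only, single-nearest-neighbor PatchMatch core that this paper generalizes: a random initialization of the nearest-neighbor field, followed by alternating propagation and random-search passes. This is an illustrative simplification (grayscale images, SSD patch distance, no k-NN, scales, or rotations); function and parameter names are our own, not from the paper.

```python
import numpy as np

def patch_dist(A, B, ay, ax, by, bx, p):
    # Sum of squared differences between the p x p patches at (ay, ax) and (by, bx).
    d = A[ay:ay + p, ax:ax + p] - B[by:by + p, bx:bx + p]
    return float((d * d).sum())

def patchmatch(A, B, p=3, iters=4, seed=0):
    """Approximate nearest-neighbor field mapping each p x p patch of A to one in B."""
    rng = np.random.default_rng(seed)
    H, W = A.shape[0] - p + 1, A.shape[1] - p + 1     # valid patch positions in A
    Hb, Wb = B.shape[0] - p + 1, B.shape[1] - p + 1   # valid patch positions in B
    # Random initialization of the nearest-neighbor field (NNF).
    nnf = np.stack([rng.integers(0, Hb, (H, W)),
                    rng.integers(0, Wb, (H, W))], axis=-1)
    cost = np.empty((H, W))
    for y in range(H):
        for x in range(W):
            cost[y, x] = patch_dist(A, B, y, x, nnf[y, x, 0], nnf[y, x, 1], p)
    for it in range(iters):
        # Alternate scan direction each iteration so offsets propagate both ways.
        fwd = (it % 2 == 0)
        ys = range(H) if fwd else range(H - 1, -1, -1)
        xs0 = range(W) if fwd else range(W - 1, -1, -1)
        d = 1 if fwd else -1
        for y in ys:
            for x in xs0:
                # Propagation: try the already-visited neighbors' offsets.
                for py, px in ((y - d, x), (y, x - d)):
                    if 0 <= py < H and 0 <= px < W:
                        by = nnf[py, px, 0] + (y - py)
                        bx = nnf[py, px, 1] + (x - px)
                        if 0 <= by < Hb and 0 <= bx < Wb:
                            c = patch_dist(A, B, y, x, by, bx, p)
                            if c < cost[y, x]:
                                nnf[y, x], cost[y, x] = (by, bx), c
                # Random search: sample around the current best at shrinking radii.
                r = max(Hb, Wb)
                while r >= 1:
                    by = rng.integers(max(0, nnf[y, x, 0] - r),
                                      min(Hb, nnf[y, x, 0] + r + 1))
                    bx = rng.integers(max(0, nnf[y, x, 1] - r),
                                      min(Wb, nnf[y, x, 1] + r + 1))
                    c = patch_dist(A, B, y, x, by, bx, p)
                    if c < cost[y, x]:
                        nnf[y, x], cost[y, x] = (by, bx), c
                    r //= 2
    return nnf, cost
```

The paper's generalizations slot into this loop: a k-element max-heap per pixel replaces the single best match, the NNF entries gain scale and rotation components sampled and propagated analogously, and `patch_dist` becomes an arbitrary descriptor distance.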