Abstract
The appearance of (outdoor) scenes changes considerably with the strength of certain transient attributes, such as "rainy", "dark", or "sunny". Obviously, this also affects the representation of an image in feature space, e.g., as activations at a certain CNN layer, and consequently impacts scene recognition performance. In this work, we investigate the variability in these transient attributes as a rich source of information for studying how image representations change as a function of attribute strength. In particular, we leverage a recently introduced dataset with fine-grained annotations to estimate feature trajectories for a collection of transient attributes and then show how these trajectories can be transferred to new image representations. This enables us to synthesize new data along the transferred trajectories with respect to the dimensions of the space spanned by the transient attributes. The applicability of this concept is demonstrated on the problem of one-shot recognition of scene locations. We show that data synthesized via feature trajectory transfer considerably boosts recognition performance, (1) with respect to baselines and (2) in combination with state-of-the-art approaches in one-shot learning.
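As a purely illustrative sketch (not the formulation used in the paper), one can think of a feature trajectory as a direction in CNN feature space parameterized by attribute strength. The snippet below assumes a simple linear trajectory model and hypothetical helper names (fit_trajectory, transfer_trajectory), toy feature dimensions, and synthetic data, to show how such a direction could be estimated from attribute-annotated images and reused to synthesize features for a new, singly observed scene.

```python
import numpy as np

def fit_trajectory(features, strengths):
    """Fit a linear feature trajectory f(s) ~ base + s * direction
    from features observed at known attribute strengths (least squares).
    NOTE: linear model and function name are illustrative assumptions."""
    s = np.asarray(strengths, dtype=float)
    A = np.stack([np.ones_like(s), s], axis=1)   # design matrix [1, s]
    coef, *_ = np.linalg.lstsq(A, np.asarray(features), rcond=None)
    base, direction = coef                        # each is a D-dim vector
    return base, direction

def transfer_trajectory(new_feature, direction, target_strengths, source_strength=0.0):
    """Synthesize features for a new image by moving its representation
    along the transferred trajectory direction to target attribute strengths."""
    deltas = np.asarray(target_strengths, dtype=float) - source_strength
    return new_feature[None, :] + deltas[:, None] * direction[None, :]

# Toy usage: features (D=4096) of one scene observed at several "sunny" strengths.
rng = np.random.default_rng(0)
strengths = np.array([0.1, 0.4, 0.7, 1.0])
feats = rng.normal(size=(4, 4096))
base, direction = fit_trajectory(feats, strengths)

# One-shot setting: a single feature of a new scene location is augmented
# by synthesizing variants along the transferred attribute trajectory.
x_new = rng.normal(size=4096)
synthetic = transfer_trajectory(x_new, direction, target_strengths=[0.2, 0.5, 0.8])
print(synthetic.shape)  # (3, 4096)
```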