
Learning Local Invariant Mahalanobis Distances


Abstract

For many tasks and data types, there are natural transformations to which the data should be invariant or insensitive. For instance, in visual recognition, natural images should be insensitive to rotation and translation. This requirement and its implications have been important in many machine learning applications, and tolerance for image transformations has primarily been achieved by using robust feature vectors. In this paper we propose a novel and computationally efficient way to learn a local Mahalanobis metric per datum, and show how we can learn a local metric that is invariant to any given transformation in order to improve performance.
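As a rough illustration of the idea behind the abstract, the sketch below computes a squared Mahalanobis distance under a per-datum metric matrix and shows one simple way (an assumption for illustration, not necessarily the paper's construction) to make that local metric insensitive to given transformation directions by projecting the transformation tangents out of the metric. All names and the toy data are hypothetical.

```python
import numpy as np

def mahalanobis_distance(x, y, M):
    """Squared Mahalanobis distance d_M(x, y) = (x - y)^T M (x - y),
    where M is a symmetric positive semidefinite matrix."""
    diff = x - y
    return float(diff @ M @ diff)

def invariant_local_metric(M, tangents):
    """Illustrative construction (not the paper's algorithm): make a local
    metric insensitive to given transformation directions by projecting the
    tangent directions (e.g. infinitesimal rotations/translations of an
    image at this datum) out of M."""
    T = np.atleast_2d(tangents)          # rows span the tangent space at x
    Q, _ = np.linalg.qr(T.T)             # orthonormal basis of that tangent space
    P = np.eye(M.shape[0]) - Q @ Q.T     # projector onto its orthogonal complement
    return P @ M @ P                     # metric that ignores the tangent directions

# Toy usage with a hypothetical 3-D datum and one tangent direction.
x = np.array([1.0, 0.0, 2.0])
y = np.array([0.5, 1.0, 1.5])
M = np.eye(3)                            # start from the Euclidean metric
t = np.array([[0.0, 1.0, 0.0]])          # pretend this is the transformation tangent at x

M_inv = invariant_local_metric(M, t)
print(mahalanobis_distance(x, y, M))     # 1.5
print(mahalanobis_distance(x, y, M_inv)) # 0.5 -- differences along the tangent are ignored
```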
