Beyond Similar and Dissimilar Relations: A Kernel Regression Formulation for Metric Learning

2019-11-05
Abstract: Most existing metric learning methods focus on learning a similarity or distance measure that relies on similar and dissimilar relations between sample pairs. However, pairs of samples cannot simply be identified as similar or dissimilar in many real-world applications, e.g., multi-label learning, label distribution learning, and tasks with continuous decision values. To this end, in this paper we propose a novel relation alignment metric learning (RAML) formulation to handle the metric learning problem in those scenarios. Since the relation between two samples can be measured by the degree of difference in their decision values, and motivated by the consistency of sample relations in the feature space and the decision space, our RAML utilizes the sample relations in the decision space to guide metric learning in the feature space. In this way, RAML formulates metric learning as a kernel regression problem, which can be efficiently optimized by standard regression solvers. We carry out experiments on single-label classification, multi-label classification, and label distribution learning tasks to demonstrate that our method achieves favorable performance against state-of-the-art methods.
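To make the relation-alignment idea concrete, below is a minimal sketch, not the authors' RAML algorithm: it assumes a Mahalanobis metric d_M(x_i, x_j) = (x_i - x_j)^T M (x_i - x_j), takes the squared Euclidean distance between decision-value vectors as the pairwise relation r_ij, and fits vec(M) with ridge regression (a standard regression solver, as the abstract suggests) followed by a projection onto the PSD cone. The function name, the squared-distance relation target, the regularizer lam, and the projection step are all illustrative assumptions.

```python
# A minimal sketch of the relation-alignment idea from the abstract, not the
# authors' exact RAML algorithm. Assumptions: a Mahalanobis metric
# d_M(x_i, x_j) = (x_i - x_j)^T M (x_i - x_j), pairwise relations r_ij taken
# as squared Euclidean distances between decision values y_i and y_j, a ridge
# regularizer lam, and a final projection of M onto the PSD cone.
import numpy as np

def fit_relation_aligned_metric(X, Y, lam=1e-2):
    """Fit M so that (x_i - x_j)^T M (x_i - x_j) ~= ||y_i - y_j||^2.

    X: (n, d) features; Y: (n, c) decision values (label vectors, label
    distributions, or continuous targets). Returns a PSD matrix M (d, d).
    """
    n, d = X.shape
    rows, targets = [], []
    for i in range(n):
        for j in range(i + 1, n):
            diff = X[i] - X[j]
            # d_M is linear in vec(M): d_M = vec(diff diff^T) . vec(M),
            # so metric learning reduces to a regression over pair features.
            rows.append(np.outer(diff, diff).ravel())
            targets.append(np.sum((Y[i] - Y[j]) ** 2))
    A = np.asarray(rows)        # (n_pairs, d*d) regression design matrix
    r = np.asarray(targets)     # (n_pairs,) decision-space relations
    # Ridge regression for vec(M): solve (A^T A + lam I) vec(M) = A^T r.
    m = np.linalg.solve(A.T @ A + lam * np.eye(d * d), A.T @ r)
    M = m.reshape(d, d)
    M = 0.5 * (M + M.T)         # symmetrize
    # Clip negative eigenvalues so M defines a valid (pseudo-)metric.
    w, V = np.linalg.eigh(M)
    return (V * np.clip(w, 0.0, None)) @ V.T
```

As a usage example under these assumptions, with features X of shape (50, 5) and label distributions Y drawn from a Dirichlet of dimension 3, fit_relation_aligned_metric(X, Y) returns a 5x5 PSD matrix whose induced distances approximate the decision-space relations. Note the explicit enumeration of all pairs costs O(n^2 d^2) and is only suitable for small illustrative datasets.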
