
Learning Robust Distance Metric with Side Information via Ratio Minimization of Orthogonally Constrained ℓ2,1-Norm Distances

2019-10-08
Abstract: Metric learning, which aims at learning a distance metric for a given data set, plays an important role in measuring the distance or similarity between data objects. Due to its broad usefulness, it has attracted much interest in machine learning and related areas over the past few decades. This paper proposes to learn the distance metric from side information in the form of must-links and cannot-links. Given the pairwise constraints, our goal is to learn a Mahalanobis distance that minimizes the ratio of the distances of the data pairs in the must-links to those in the cannot-links. Unlike many existing papers that use the traditional squared ℓ2-norm distance, we develop a robust model that is less sensitive to data noise or outliers by using the not-squared ℓ2-norm distance. In our objective, an orthonormal constraint is enforced to avoid degenerate solutions. To solve our objective, we have derived an efficient iterative solution algorithm. We have conducted extensive experiments, which demonstrate the superiority of our method over state-of-the-art methods.
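The objective described in the abstract can be illustrated with a small numerical sketch. The following is not the authors' code: it assumes the common parameterization of the Mahalanobis metric as M = W Wᵀ with orthonormal columns (Wᵀ W = I), so that each pairwise distance is the not-squared ℓ2 norm ‖Wᵀ(x_i − x_j)‖, and the objective is the sum of these norms over must-link pairs divided by the sum over cannot-link pairs. The function name ratio_objective and the toy data are hypothetical; the paper's iterative solver for W is not reproduced here.

```python
import numpy as np

def ratio_objective(W, X, must_links, cannot_links):
    """Ratio of summed not-squared l2 distances over must-link pairs
    to those over cannot-link pairs, under an orthonormal projection W.

    Assumption: the Mahalanobis metric is parameterized as M = W @ W.T
    with W.T @ W = I, so d(x_i, x_j) = ||W.T (x_i - x_j)||_2 (not squared).
    Summing these norms over a set of pairs is an l2,1-norm of the
    projected pairwise differences.
    """
    def summed_distances(pairs):
        diffs = np.stack([X[i] - X[j] for i, j in pairs])   # (n_pairs, d)
        return np.linalg.norm(diffs @ W, axis=1).sum()       # sum of per-pair l2 norms

    return summed_distances(must_links) / summed_distances(cannot_links)

# Toy usage with a random orthonormal W (hypothetical data, for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 5))                  # 10 samples, 5 features
W, _ = np.linalg.qr(rng.normal(size=(5, 2)))  # 5x2 matrix with orthonormal columns
must_links = [(0, 1), (2, 3)]                 # pairs that should be close
cannot_links = [(0, 9), (4, 8)]               # pairs that should be far apart
print(ratio_objective(W, X, must_links, cannot_links))
```

A small ratio means must-link pairs are pulled together while cannot-link pairs stay far apart; per the abstract, the orthonormal constraint on W is what rules out degenerate solutions such as collapsing the projection to zero.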

Previous: Learning Multiple Maps from Conditional Ordinal Triplets

Next: Learning Sound Events from Webly Labeled Data
