
Information-theoretic Semi-supervised Metric Learning via Entropy Regularization

2020-03-02

Abstract

We propose a general information-theoretic approach called SERAPH (SEmi-supervised metRic leArning Paradigm with Hyper-sparsity) for metric learning that does not rely upon the manifold assumption. Given the probability parameterized by a Mahalanobis distance, we maximize the entropy of that probability on labeled data and minimize it on unlabeled data following entropy regularization, which allows the supervised and unsupervised parts to be integrated in a natural and meaningful way. Furthermore, SERAPH is regularized by encouraging a low-rank projection induced from the metric. The optimization of SERAPH is solved efficiently and stably by an EM-like scheme with an analytical E-step and a convex M-step. Experiments demonstrate that SERAPH compares favorably with many well-known global and local metric learning methods.
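The core quantities in the abstract can be illustrated with a minimal sketch: a squared Mahalanobis distance under a PSD matrix, a sigmoid-style probability that a pair shares a label, and the binary entropy that the entropy-regularization objective maximizes on labeled pairs and minimizes on unlabeled pairs. This is not the paper's exact formulation; the sigmoid form and the threshold `mu` are illustrative assumptions.

```python
import numpy as np

def mahalanobis_sq(x, y, A):
    """Squared Mahalanobis distance between x and y under PSD matrix A."""
    d = x - y
    return d @ A @ d

def pair_prob(x, y, A, mu=1.0):
    """Probability that x and y share a label (illustrative sigmoid form;
    the distance threshold mu is an assumed hyperparameter)."""
    return 1.0 / (1.0 + np.exp(mahalanobis_sq(x, y, A) - mu))

def binary_entropy(p):
    """Binary entropy of p, clipped for numerical safety."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))
```

With `A` set to the identity the distance reduces to the squared Euclidean distance; learning `A` (with a low-rank penalty) is what changes which pairs look confidently similar or dissimilar, and hence the entropy of the unlabeled pairs.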

