
Squared-loss Mutual Information Regularization: A Novel Information-theoretic Approach to Semi-supervised Learning

2020-03-03

Abstract

We propose squared-loss mutual information regularization (SMIR) for multi-class probabilistic classification, following the information maximization principle. SMIR is convex under mild conditions and thus avoids the nonconvexity of standard mutual information regularization. It offers all of the following four abilities to semi-supervised algorithms: analytical solution, out-of-sample classification, multi-class classification, and probabilistic output. Furthermore, novel generalization error bounds are derived. Experiments show that SMIR compares favorably with state-of-the-art methods.
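To make the regularizer concrete: with a class-posterior model p(y | x) and a uniform class prior p(y) = 1/c, the empirical squared-loss mutual information over n points reduces to (c / 2n) · Σ_i Σ_y p(y | x_i)² − 1/2, which a semi-supervised learner can maximize on unlabeled data. Below is a minimal sketch of this estimator; the function name `empirical_smi` is ours, not from the paper, and this illustrates only the SMI term, not the full SMIR training procedure.

```python
import numpy as np

def empirical_smi(posteriors):
    """Empirical squared-loss mutual information between inputs and labels.

    posteriors: (n, c) array; row i holds the model's class posterior
    p(y | x_i). Assumes a uniform class prior p(y) = 1/c, as in the
    semi-supervised SMIR setting.

    Returns (c / (2n)) * sum_i sum_y p(y | x_i)^2 - 1/2.
    """
    n, c = posteriors.shape
    return (c / (2.0 * n)) * np.sum(posteriors ** 2) - 0.5
```

Maximizing this quantity on unlabeled points pushes the posteriors toward confident predictions: the estimate is 0 when every posterior is uniform and reaches its maximum (c − 1)/2 when every posterior is one-hot.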

