The Pessimistic Limits and Possibilities of Margin-based Losses in Semi-supervised Learning

2020-02-13

Abstract

Consider a classification problem where both labeled and unlabeled data are available. We show that for linear classifiers defined by convex margin-based surrogate losses that are decreasing, it is impossible to construct any semi-supervised approach that can guarantee an improvement over the supervised classifier, as measured by this surrogate loss on the labeled and unlabeled data. For convex margin-based loss functions that also increase, we demonstrate that safe improvements are possible.
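The guarantee discussed in the abstract is evaluated against a pessimistic objective: the surrogate loss on the labeled data plus the worst-case surrogate loss on the unlabeled data over all possible labelings. The minimal sketch below (an illustration of this objective, not the paper's algorithm; the function names are hypothetical) contrasts a decreasing loss (hinge) with one that also increases (quadratic), for a linear classifier:

```python
import numpy as np

# Two convex margin-based surrogate losses of the kind the abstract contrasts:
# hinge is monotonically decreasing in the margin m = y * (w @ x), while
# quadratic also increases for margins beyond 1.
def hinge(m):
    return np.maximum(0.0, 1.0 - m)

def quadratic(m):
    return (1.0 - m) ** 2

def pessimistic_objective(w, X_lab, y_lab, X_unl, loss):
    """Surrogate loss on labeled data plus the worst-case (over the unknown
    labels in {-1, +1}) surrogate loss on unlabeled data, for a linear
    classifier with weight vector w."""
    d_lab = X_lab @ w
    d_unl = X_unl @ w
    lab_term = loss(y_lab * d_lab).sum()
    # For each unlabeled point, the adversary picks whichever label
    # (+1 or -1) maximizes the surrogate loss at that point.
    unl_term = np.maximum(loss(d_unl), loss(-d_unl)).sum()
    return lab_term + unl_term

# Toy data: one labeled point classified with margin 1, one unlabeled point.
w = np.array([1.0, 0.0])
X_lab = np.array([[1.0, 0.0]])
y_lab = np.array([1.0])
X_unl = np.array([[2.0, 0.0]])

print(pessimistic_objective(w, X_lab, y_lab, X_unl, hinge))      # 3.0
print(pessimistic_objective(w, X_lab, y_lab, X_unl, quadratic))  # 9.0
```

For a decreasing loss like hinge, the adversarial label of an unlabeled point is always the one opposite the classifier's prediction, which is what makes guaranteed improvement over the supervised solution impossible under this criterion; a loss that also increases (like quadratic) penalizes over-confident margins in both directions, which is what opens the door to the safe improvements the abstract mentions.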
