Maximum Classifier Discrepancy for Unsupervised Domain Adaptation

2019-10-14

Abstract: In this work, we present a method for unsupervised domain adaptation. Many adversarial learning methods train domain classifier networks to distinguish the features as either a source or target and train a feature generator network to mimic the discriminator. Two problems exist with these methods. First, the domain classifier only tries to distinguish the features as a source or target and thus does not consider task-specific decision boundaries between classes. Therefore, a trained generator can generate ambiguous features near class boundaries. Second, these methods aim to completely match the feature distributions between different domains, which is difficult because of each domain’s characteristics. To solve these problems, we introduce a new approach that attempts to align distributions of source and target by utilizing the task-specific decision boundaries. We propose to maximize the discrepancy between two classifiers’ outputs to detect target samples that are far from the support of the source. A feature generator learns to generate target features near the support to minimize the discrepancy. Our method outperforms other methods on several datasets of image classification and semantic segmentation. The codes are available at https://github.com/mil-tokyo/MCD_DA
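To make the adversarial procedure described in the abstract concrete, below is a minimal PyTorch-style sketch of one adaptation round. It assumes a feature generator G, two classifiers C1 and C2, an L1 discrepancy between their softmax outputs, and a small fixed number of generator updates per batch; the function names, optimizers, and loop counts are illustrative assumptions, not the authors' implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def discrepancy(p1, p2):
    # L1 distance between the two classifiers' softmax outputs
    return torch.mean(torch.abs(F.softmax(p1, dim=1) - F.softmax(p2, dim=1)))

def mcd_step(G, C1, C2, opt_g, opt_c, xs, ys, xt, num_gen_updates=4):
    """One adaptation round on a labeled source batch (xs, ys) and an
    unlabeled target batch xt. Hyperparameters are illustrative."""
    ce = nn.CrossEntropyLoss()

    # Step A: train G, C1, C2 to classify the labeled source data.
    opt_g.zero_grad()
    opt_c.zero_grad()
    feat_s = G(xs)
    loss_src = ce(C1(feat_s), ys) + ce(C2(feat_s), ys)
    loss_src.backward()
    opt_g.step()
    opt_c.step()

    # Step B: fix G; update C1, C2 to keep source accuracy while
    # maximizing their disagreement on target features.
    opt_c.zero_grad()
    feat_s = G(xs)
    feat_t = G(xt).detach()  # generator is frozen in this step
    loss_cls = (ce(C1(feat_s), ys) + ce(C2(feat_s), ys)
                - discrepancy(C1(feat_t), C2(feat_t)))
    loss_cls.backward()
    opt_c.step()

    # Step C: fix C1, C2; update G to minimize the discrepancy,
    # pushing target features toward the source support.
    for _ in range(num_gen_updates):
        opt_g.zero_grad()
        feat_t = G(xt)
        loss_gen = discrepancy(C1(feat_t), C2(feat_t))
        loss_gen.backward()
        opt_g.step()
```

The number of generator updates in the last step is a hyperparameter of the method; the value 4 above is only a placeholder.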
