Unsupervised Domain Adaptation for Semantic Segmentation via Class-Balanced Self-Training


2019-10-29

Abstract. Recent deep networks have achieved state-of-the-art performance on a variety of semantic segmentation tasks. Despite such progress, these models often face challenges in real-world "wild tasks" where a large difference exists between labeled training/source data and unseen test/target data. In particular, such a difference is often referred to as a "domain gap", and can cause significantly decreased performance that cannot be easily remedied by further increasing the representation power. Unsupervised domain adaptation (UDA) seeks to overcome this problem without target domain labels. In this paper, we propose a novel UDA framework based on an iterative self-training (ST) procedure, where the problem is formulated as latent variable loss minimization and can be solved by alternately generating pseudo-labels on target data and re-training the model with these labels. On top of ST, we also propose a novel class-balanced self-training (CBST) framework to avoid the gradual dominance of large classes in pseudo-label generation, and introduce spatial priors to refine the generated labels. Comprehensive experiments show that the proposed methods achieve state-of-the-art semantic segmentation performance under multiple major UDA settings.
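The class-balancing idea in the abstract — selecting pseudo-labels with per-class confidence thresholds so that frequent classes do not crowd out rare ones — can be sketched as follows. This is a minimal illustration with NumPy, not the paper's implementation; the function name, the `portion` parameter, and the use of a per-class confidence quantile as the threshold are assumptions for demonstration.

```python
import numpy as np

def class_balanced_pseudo_labels(probs, portion=0.5):
    """Sketch of class-balanced pseudo-label selection (CBST-style).

    probs:   (N, C) softmax outputs for N target pixels/samples.
    portion: fraction of each class's predictions to keep as pseudo-labels,
             implemented here as a per-class confidence quantile (an assumption).
    Returns an (N,) label array, with -1 marking samples left unlabeled.
    """
    n, c = probs.shape
    pred = probs.argmax(axis=1)          # hard predictions
    conf = probs.max(axis=1)             # confidence of each prediction
    labels = np.full(n, -1, dtype=int)   # -1 = ignored (no pseudo-label)
    for k in range(c):
        mask = pred == k
        if not mask.any():
            continue
        # Per-class threshold: keep only the most confident `portion` of
        # class k's predictions, so small classes still contribute labels.
        thresh = np.quantile(conf[mask], 1.0 - portion)
        labels[mask & (conf >= thresh)] = k
    return labels
```

A single global threshold would instead be dominated by classes the model is already confident about (typically the large ones); thresholding each class separately is what keeps the pseudo-label set balanced across classes during the iterative retraining loop.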

