
ROBUST TRAINING WITH ENSEMBLE CONSENSUS

2020-01-02

Abstract

Since deep neural networks are over-parametrized, they may memorize noisy examples. We address this memorization issue in the presence of annotation noise. Because deep neural networks cannot generalize to neighborhoods of the features acquired via memorization, we observe that noisy examples do not consistently incur small losses on the network under a certain perturbation. Based on this observation, we propose a novel training method called Learning with Ensemble Consensus (LEC) that prevents overfitting to noisy examples by eliminating them using the consensus of an ensemble of perturbed networks. One of the proposed LEC variants, LTEC, outperforms the current state-of-the-art methods on MNIST, CIFAR-10, and CIFAR-100 with efficient memory usage.
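The sketch below illustrates the filtering idea the abstract describes: an ensemble is formed by perturbing the network, per-example losses are computed under each perturbed copy, and only examples that incur small losses under every member (the consensus) are used for the update. This is a minimal PyTorch sketch under assumed details; the perturbation scheme, the names `perturb_std`, `keep_ratio`, and `num_members`, and the small-loss selection rule are illustrative assumptions, not the authors' exact algorithm.

```python
# Hedged sketch of ensemble-consensus filtering for noisy labels.
# Assumptions (not from the paper): Gaussian weight perturbation,
# per-batch small-loss selection with a fixed keep_ratio.
import copy
import torch
import torch.nn.functional as F

def consensus_mask(model, x, y, num_members=5, perturb_std=1e-3, keep_ratio=0.8):
    """Keep only examples that incur small losses on *every* perturbed
    copy of the network (the ensemble consensus)."""
    masks = []
    for _ in range(num_members):
        member = copy.deepcopy(model)
        with torch.no_grad():
            # Perturb the weights of this ensemble member.
            for p in member.parameters():
                p.add_(perturb_std * torch.randn_like(p))
            losses = F.cross_entropy(member(x), y, reduction="none")
        # Mark the keep_ratio fraction with the smallest losses.
        k = max(1, int(keep_ratio * len(losses)))
        small_loss = torch.zeros_like(losses, dtype=torch.bool)
        small_loss[losses.topk(k, largest=False).indices] = True
        masks.append(small_loss)
    # Consensus: an example survives only if it is small-loss for all members.
    return torch.stack(masks).all(dim=0)

def training_step(model, optimizer, x, y):
    mask = consensus_mask(model, x, y)
    if mask.any():
        optimizer.zero_grad()
        loss = F.cross_entropy(model(x)[mask], y[mask])
        loss.backward()
        optimizer.step()
```

The intersection over members is the key step: a clean example tends to stay small-loss under all perturbations, while a memorized noisy example does not, so requiring unanimity filters it out.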
