Noisy Softmax: Improving the Generalization Ability of DCNN via Postponing the Early Softmax Saturation

Abstract

Over the past few years, softmax and SGD have become a commonly used component and the default training strategy in CNN frameworks, respectively. However, when optimizing CNNs with SGD, the saturation behavior behind softmax often gives the illusion of training well and is therefore overlooked. In this paper, we first emphasize that the early saturation behavior of softmax impedes the exploration of SGD, which is sometimes a reason for the model converging at a bad local minimum, and then propose Noisy Softmax to mitigate this early saturation issue by injecting annealed noise into softmax during each iteration. This noise-injection operation aims at postponing the early saturation and further bringing continuous gradient propagation, so as to significantly encourage the SGD solver to be more exploratory and help it find a better local minimum. This paper empirically verifies the superiority of early softmax desaturation, and our method indeed improves the generalization ability of the CNN model through regularization. We experimentally find that this early desaturation helps optimization in many tasks, yielding state-of-the-art or competitive results on several popular benchmark datasets.
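The abstract only outlines the mechanism, so the following is a minimal sketch rather than the authors' implementation: it assumes Noisy Softmax amounts to injecting zero-mean Gaussian noise into the logits before the softmax cross-entropy step, with the noise scale annealed toward zero over training so that early iterations are desaturated while late training recovers the standard softmax. The class name `NoisySoftmaxLoss`, the linear annealing schedule, and the `initial_scale`/`total_iters` parameters are all illustrative assumptions, not names from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class NoisySoftmaxLoss(nn.Module):
    """Illustrative noise-injected softmax cross-entropy loss.

    Assumption: annealed zero-mean Gaussian noise is added to the logits
    during training to postpone early softmax saturation; once the noise
    scale anneals to zero (or in eval mode) this reduces to the standard
    softmax cross-entropy. Not the paper's exact formulation.
    """

    def __init__(self, initial_scale: float = 1.0, total_iters: int = 10000):
        super().__init__()
        self.initial_scale = initial_scale  # assumed starting noise magnitude
        self.total_iters = total_iters      # assumed annealing horizon
        self.iteration = 0

    def forward(self, logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Linearly anneal the noise scale toward zero so that early
            # iterations stay desaturated while late training converges
            # to the ordinary softmax objective.
            progress = min(self.iteration / self.total_iters, 1.0)
            scale = self.initial_scale * (1.0 - progress)
            # Zero-mean Gaussian noise keeps predicted probabilities away
            # from 0/1, so gradients keep flowing through examples the
            # saturated softmax would otherwise ignore.
            logits = logits + scale * torch.randn_like(logits)
            self.iteration += 1
        return F.cross_entropy(logits, targets)
```

In use, such a module would simply replace `nn.CrossEntropyLoss` in a standard training loop (e.g. `loss = criterion(model(x), y)`); the intended effect is the one the abstract describes: perturbed logits postpone saturation, keeping gradient propagation continuous and the SGD solver more exploratory.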
