Coherent Gradients: An Approach to Understanding Generalization in Gradient Descent-Based Optimization

2020-01-02

Abstract

An open question in the Deep Learning community is why neural networks trained with Gradient Descent generalize well on real datasets even though they are capable of fitting random data. We propose an approach to answering this question based on a hypothesis about the dynamics of gradient descent that we call Coherent Gradients: Gradients from similar examples are similar and so the overall gradient is stronger in certain directions where these reinforce each other. Thus changes to the network parameters during training are biased towards those that (locally) simultaneously benefit many examples when such similarity exists. We support this hypothesis with heuristic arguments and perturbative experiments and outline how this can explain several common empirical observations about Deep Learning. Furthermore, our analysis is not just descriptive, but prescriptive. It suggests a natural modification to gradient descent that can greatly reduce overfitting.
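
The abstract does not spell out the modification to gradient descent it alludes to, so the sketch below is only a plausible illustration of the idea: if coherence across examples is what drives generalization, one can suppress update directions that are not reinforced by many examples by winsorizing per-example gradients coordinate-wise before averaging them. The function name `winsorized_gradient`, the clipping parameter `c`, and the toy linear least-squares setup are illustrative choices, not taken from the paper.

```python
import numpy as np

def winsorized_gradient(per_example_grads, c=1):
    """Coordinate-wise winsorized mean of per-example gradients.

    per_example_grads: array of shape (batch_size, n_params); row i is the
    gradient of example i's loss with respect to the parameters.
    c: number of extreme values clipped at each end of every coordinate
    (requires 2 * c < batch_size).

    Clipping the outliers keeps only the part of the update that many
    examples "agree" on, i.e. the coherent directions described above.
    """
    sorted_grads = np.sort(per_example_grads, axis=0)
    lo = sorted_grads[c]             # c-th smallest value per coordinate
    hi = sorted_grads[-(c + 1)]      # c-th largest value per coordinate
    return np.clip(per_example_grads, lo, hi).mean(axis=0)

# Toy usage: one step of gradient descent on a linear least-squares loss.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(32, 5)), rng.normal(size=32)
w = np.zeros(5)
residuals = X @ w - y
per_example_grads = residuals[:, None] * X   # grad of 0.5 * (x.w - y)^2 per example
w -= 0.1 * winsorized_gradient(per_example_grads, c=2)
```

With `c = 0` this reduces to the ordinary mini-batch average, so the parameter controls how aggressively incoherent per-example contributions are discarded.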

