SLICED CRAMÉR SYNAPTIC CONSOLIDATION FOR PRESERVING DEEPLY LEARNED REPRESENTATIONS

2020-01-02

Abstract

Deep neural networks suffer from an inability to preserve learned data representations (i.e., catastrophic forgetting) in domains where the input data distribution is non-stationary and changes during training. Various selective synaptic plasticity approaches have recently been proposed to preserve the network parameters that are crucial for previously learned tasks while learning new tasks. We explore such selective synaptic plasticity approaches through a unifying lens of memory replay and show the close relationship between methods like Elastic Weight Consolidation (EWC) and Memory Aware Synapses (MAS). We then propose a fundamentally different class of preservation methods that aim at preserving the distribution of the network’s output at an arbitrary layer for previous tasks while learning a new one. We propose the sliced Cramér distance as a suitable choice for such preservation and evaluate our Sliced Cramér Preservation (SCP) algorithm through extensive empirical investigations on various network architectures in both supervised and unsupervised learning settings. We show that SCP consistently utilizes the learning capacity of the network better than online-EWC and MAS on various incremental learning tasks.
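
The central quantity mentioned in the abstract is the sliced Cramér distance between a layer's output distributions before and after learning a new task. Below is a minimal NumPy sketch, for illustration only: it projects two sets of layer activations onto random unit directions and averages the 1D Cramér-2 distance (the integrated squared difference of empirical CDFs) over those slices. The function name, the `num_slices` parameter, and the pooled-sample grid used to approximate the integral are assumptions of this sketch, not the authors' implementation.

```python
import numpy as np

def sliced_cramer_distance(acts_old, acts_new, num_slices=50, seed=None):
    """Average 1D Cramer-2 distance between two activation sets after
    projecting them onto random unit directions ("slices").

    acts_old, acts_new: (n_samples, dim) arrays of a layer's outputs for
    the previous-task model and the current model, respectively.
    Illustrative sketch only; not the authors' SCP implementation.
    """
    rng = np.random.default_rng(seed)
    dim = acts_old.shape[1]
    total = 0.0
    for _ in range(num_slices):
        # Random unit direction defining one 1D slice.
        theta = rng.normal(size=dim)
        theta /= np.linalg.norm(theta)
        p = np.sort(acts_old @ theta)
        q = np.sort(acts_new @ theta)
        # Empirical CDFs of both slices, evaluated on the pooled support.
        support = np.sort(np.concatenate([p, q]))
        cdf_p = np.searchsorted(p, support, side="right") / len(p)
        cdf_q = np.searchsorted(q, support, side="right") / len(q)
        # Cramer-2 distance: integral of the squared CDF difference,
        # approximated on the pooled sample grid.
        widths = np.diff(support, append=support[-1])
        total += np.sum((cdf_p - cdf_q) ** 2 * widths)
    return total / num_slices


# Example: activations drawn from two slightly shifted distributions.
old_acts = np.random.default_rng(0).normal(size=(256, 64))
new_acts = np.random.default_rng(1).normal(loc=0.2, size=(256, 64))
print(sliced_cramer_distance(old_acts, new_acts, num_slices=20, seed=0))
```

In an SCP-style setup, a distance of this kind (or a penalty derived from it) would be evaluated between activations of the model before and after training on a new task, analogous to how EWC and MAS penalize drift in important parameters.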


