
KNOWLEDGE CONSISTENCY BETWEEN NEURAL NETWORKS

2020-01-02

Abstract
This paper aims to analyze knowledge consistency between pre-trained deep neural networks. We propose a generic definition of knowledge consistency between neural networks at different fuzziness levels. A task-agnostic method is designed to disentangle feature components that represent the consistent knowledge from the raw intermediate-layer features of each neural network. As a generic tool, our method can be broadly applied to different tasks. In preliminary experiments, we used knowledge consistency as a tool to diagnose the representations of neural networks. Knowledge consistency also provides new insights into the success of existing deep-learning techniques, such as knowledge distillation and network compression. More crucially, knowledge consistency can be used to refine pre-trained networks and boost performance. The code will be released when the paper is accepted.
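The abstract does not spell out the disentanglement procedure, but the core idea can be illustrated with a minimal, hypothetical sketch: treat the part of one network's intermediate features that a small learned regressor can reconstruct from another network's features as the consistent component, and the residual as the inconsistent component. The function `fit_consistent_component` and the toy feature tensors below are illustrative assumptions, not the paper's actual implementation; loosely, allowing the regressor more nonlinearity would correspond to admitting consistency at a higher fuzziness level.

```python
import torch
import torch.nn as nn

# Hypothetical sketch: approximate the "consistent" component of net B's
# intermediate features as the part that a learned linear map can
# reconstruct from net A's features. `feats_a` and `feats_b` stand in for
# pre-extracted intermediate-layer activations of two pre-trained networks
# evaluated on the same inputs.

def fit_consistent_component(feats_a, feats_b, epochs=100, lr=1e-3):
    """Fit a linear map g so that g(feats_a) approximates feats_b.

    g(feats_a) serves as a stand-in for the feature component of net B
    that is consistent with net A; feats_b - g(feats_a) is the residual,
    i.e. the inconsistent component.
    """
    g = nn.Linear(feats_a.shape[1], feats_b.shape[1])
    opt = torch.optim.Adam(g.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(g(feats_a), feats_b)
        loss.backward()
        opt.step()
    with torch.no_grad():
        consistent = g(feats_a)
        inconsistent = feats_b - consistent
    return consistent, inconsistent

# Toy usage with random stand-in features: 1000 samples,
# 512-d features from net A, 256-d features from net B.
feats_a = torch.randn(1000, 512)
feats_b = torch.randn(1000, 256)
consistent, inconsistent = fit_consistent_component(feats_a, feats_b)
print(consistent.shape, inconsistent.shape)
```

A linear regressor is the simplest possible choice here; replacing it with a small nonlinear network would recover a fuzzier notion of consistency, at the cost of counting more of net B's features as "consistent".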

