Cooperative neural networks (CoNN): Exploiting prior independence structure for improved classification

2020-02-18

Abstract 

We propose a new approach, called cooperative neural networks (CoNN), which uses a set of cooperatively trained neural networks to capture latent representations that exploit a given prior independence structure. The model is more flexible than traditional graphical models based on exponential family distributions, but incorporates more domain-specific prior structure than traditional deep networks or variational autoencoders. The framework is very general and can be used to exploit the independence structure of any graphical model. We illustrate the technique by showing that we can transfer the independence structure of the popular Latent Dirichlet Allocation (LDA) model to a cooperative neural network, CoNN-sLDA. Empirical evaluation of CoNN-sLDA on supervised text classification tasks demonstrates that the theoretical advantages of prior independence structure can be realized in practice: we demonstrate a 23% reduction in error on the challenging MultiSent data set compared to the state of the art.
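To give a flavor of what "cooperatively trained" networks means, here is a minimal toy sketch. It is not the paper's CoNN-sLDA architecture: the networks are reduced to linear maps, the data is random, and the cooperation signal is a simple agreement penalty between the two latent views, updated in alternation (loosely mimicking message passing between the coupled factors of a graphical model). All names and numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "documents": 20 bag-of-words vectors over a 10-word vocabulary.
X = rng.random((20, 10))

d_latent = 4
# Two cooperating "networks", reduced to linear embeddings for illustration.
# In CoNN each would be a separate neural network tied to one factor of the
# prior independence structure (e.g. document-level vs. word-level latents).
W_a = rng.normal(scale=0.1, size=(10, d_latent))
W_b = rng.normal(scale=0.1, size=(10, d_latent))

def gap(Wa, Wb):
    """Mean disagreement between the two latent views of the data."""
    return float(np.abs(X @ Wa - X @ Wb).mean())

gap_before = gap(W_a, W_b)

lr = 0.1
for step in range(200):
    # Alternating updates: each network is pulled toward agreement with the
    # other's *current* output while that output is held fixed.
    grad_a = X.T @ (X @ W_a - X @ W_b) / len(X)
    W_a -= lr * grad_a
    grad_b = X.T @ (X @ W_b - X @ W_a) / len(X)
    W_b -= lr * grad_b

gap_after = gap(W_a, W_b)
print(gap_before, gap_after)
```

In the actual method the agreement term would be combined with each network's own data-fit objective (and, for CoNN-sLDA, a supervised classification loss), so the networks converge to a shared latent representation rather than merely to each other.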
