
Discovering Discrete Latent Topics with Neural Variational Inference


Abstract

Topic models have been widely explored as probabilistic generative models of documents. Traditional inference methods have sought closed-form derivations for updating the models; however, as the expressiveness of these models grows, so does the difficulty of performing fast and accurate inference over their parameters. This paper presents alternative neural approaches to topic modelling by providing parameterisable distributions over topics which permit training by backpropagation in the framework of neural variational inference. In addition, with the help of a stick-breaking construction, we propose a recurrent network that is able to discover a notionally unbounded number of topics, analogous to Bayesian non-parametric topic models. Experimental results on the MXM Song Lyrics, 20NewsGroups and Reuters News datasets demonstrate the effectiveness and efficiency of these neural topic models.
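To make the abstract's two key ideas concrete — reparameterised latent variables trained by backpropagation, and a stick-breaking construction over topic proportions — here is a minimal, hypothetical PyTorch sketch. It is not the paper's implementation; the module name, layer sizes, and the use of a fixed-width linear layer for the break proportions are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class StickBreakingEncoder(nn.Module):
    """Sketch: map a bag-of-words vector to topic proportions via a
    Gaussian latent draw followed by a stick-breaking construction.
    (Hypothetical illustration, not the paper's exact architecture.)"""

    def __init__(self, vocab_size, hidden_size, max_topics):
        super().__init__()
        self.mlp = nn.Linear(vocab_size, hidden_size)
        self.mu = nn.Linear(hidden_size, max_topics - 1)
        self.log_sigma = nn.Linear(hidden_size, max_topics - 1)

    def forward(self, bow):
        h = torch.relu(self.mlp(bow))
        mu, log_sigma = self.mu(h), self.log_sigma(h)
        # Reparameterisation trick: z ~ N(mu, sigma^2), differentiable in
        # mu and log_sigma, so the whole model trains by backpropagation.
        z = mu + torch.randn_like(mu) * torch.exp(log_sigma)
        # Stick-breaking: squash to break proportions eta in (0, 1), then
        # theta_k = eta_k * prod_{j<k} (1 - eta_j).
        eta = torch.sigmoid(z)
        remaining = torch.cumprod(1.0 - eta, dim=-1)
        pieces = eta * torch.cat(
            [torch.ones_like(eta[:, :1]), remaining[:, :-1]], dim=-1)
        # The final stick takes whatever length remains, so theta sums to 1.
        theta = torch.cat([pieces, remaining[:, -1:]], dim=-1)
        return theta, mu, log_sigma


if __name__ == "__main__":
    enc = StickBreakingEncoder(vocab_size=2000, hidden_size=256, max_topics=50)
    theta, mu, log_sigma = enc(torch.rand(4, 2000))
    print(theta.sum(dim=-1))  # each row of topic proportions sums to 1.0
```

In the recurrent variant the abstract describes, the break proportions would be produced sequentially by a recurrent network rather than by a fixed-width linear layer, which is what allows the number of topics to remain notionally unbounded.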
