
Scaling Factorial Hidden Markov Models: Stochastic Variational Inference without Messages


Abstract 

Factorial Hidden Markov Models (FHMMs) are powerful models for sequential data, but they do not scale well to long sequences. We propose a scalable inference and learning algorithm for FHMMs that draws on ideas from the stochastic variational inference, neural network, and copula literatures. Unlike existing approaches, the proposed algorithm requires no message-passing procedure among latent variables and can be distributed across a network of computers to speed up learning. Our experiments corroborate that the proposed algorithm does not introduce further approximation bias compared to the proven structured mean-field algorithm, and that it achieves better performance with long sequences and large FHMMs.
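To make the "no messages" idea concrete, below is a minimal sketch, not the paper's algorithm: the paper builds richer variational posteriors with neural networks and copulas, whereas this sketch assumes a plain mean-field Bernoulli posterior over M binary chains, Gaussian emissions with known weights `W`, a shared 2x2 transition matrix, and a hypothetical window length of 50. Because every ELBO term is local in time, the objective can be estimated on randomly sampled subsequences with ordinary stochastic gradients, and no forward-backward message passing is ever run.

```python
# A rough sketch of message-free stochastic variational inference for an
# FHMM. All names, sizes, and constants here are illustrative assumptions.
import torch

torch.manual_seed(0)

T, M = 1000, 3                        # sequence length, number of binary chains
W = torch.randn(M)                    # known emission weight per chain (assumed)
sigma = 0.5                           # emission noise standard deviation (assumed)

# Synthetic observations from an FHMM-like process, for illustration only.
true_s = (torch.rand(T, M) < 0.5).float()
y = true_s @ W + sigma * torch.randn(T)

# Variational parameters: one Bernoulli logit per (time step, chain), plus a
# shared 2x2 transition matrix in logit form. No messages are needed.
logits = torch.zeros(T, M, requires_grad=True)
logA = torch.zeros(2, 2, requires_grad=True)
opt = torch.optim.Adam([logits, logA], lr=0.05)

def window_elbo(t0, L):
    """Closed-form ELBO over the window [t0, t0+L): every term is local."""
    q = torch.sigmoid(logits[t0:t0 + L])                # q(s_{t,m} = 1)
    # Expected Gaussian log-likelihood under the factorized posterior:
    # E[(y - W.s)^2] = (y - W.q)^2 + sum_m W_m^2 q_m (1 - q_m).
    mean = q @ W
    var = (q * (1 - q)) @ (W ** 2)
    ll = -0.5 * (((y[t0:t0 + L] - mean) ** 2 + var) / sigma ** 2).sum()
    # Entropy of the factorized Bernoulli posterior.
    ent = -(q * torch.log(q + 1e-8) + (1 - q) * torch.log(1 - q + 1e-8)).sum()
    # Expected log transition probability between adjacent steps in the window.
    logp = torch.log_softmax(logA, dim=1)
    q0, q1 = q[:-1], q[1:]
    trans = (q0 * q1 * logp[1, 1] + q0 * (1 - q1) * logp[1, 0]
             + (1 - q0) * q1 * logp[0, 1]
             + (1 - q0) * (1 - q1) * logp[0, 0]).sum()
    return (ll + ent + trans) * (T / L)                 # rescale to full sequence

for step in range(2000):
    t0 = int(torch.randint(0, T - 50, (1,)))            # random window start
    loss = -window_elbo(t0, 50)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Since each window's gradient touches only its own slice of the variational parameters, different windows could be processed on different machines, which is the locality property the abstract's distributed-learning claim relies on.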
