
Learning Finite Beta-Liouville Mixture Models via Variational Bayes for Proportional Data Clustering

Abstract: During the past decade, finite mixture modeling has become a well-established technique for data analysis and clustering. This paper focuses on developing a variational inference framework for learning finite Beta-Liouville mixture models, which have recently been proposed as an efficient way to cluster proportional data. In contrast to the conventional expectation maximization (EM) algorithm commonly used for learning finite mixture models, the proposed algorithm is more computationally efficient and prevents over- and under-fitting problems. Moreover, the complexity of the mixture model (i.e., the number of components) can be determined automatically and simultaneously with parameter estimation, in closed form, as part of the Bayesian inference procedure. The merits of the proposed approach are shown using both artificial data sets and two interesting and challenging real applications, namely dynamic texture clustering and facial expression recognition.
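The paper's Beta-Liouville variational updates are not available in a standard library, so the sketch below is only a rough, analogous illustration of the key property described in the abstract: when a mixture is learned by variational Bayes, components with negligible posterior weight are effectively pruned, so model complexity is selected alongside parameter estimation. It uses scikit-learn's BayesianGaussianMixture (Gaussian components standing in for the paper's Beta-Liouville densities) on synthetic proportional data; the data, priors, and thresholds are assumptions chosen for illustration.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Synthetic proportional data: three Dirichlet clusters on the 3-simplex (assumed for illustration).
rng = np.random.default_rng(0)
X = np.vstack([
    rng.dirichlet([20, 2, 2], size=200),
    rng.dirichlet([2, 20, 2], size=200),
    rng.dirichlet([2, 2, 20], size=200),
])
# Proportions sum to one, so drop the redundant last coordinate to avoid a singular covariance.
X = X[:, :2]

# Variational Bayes mixture: start with more components than needed and let the
# Dirichlet prior on the mixing weights shrink the superfluous ones toward zero.
vb = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_distribution",
    weight_concentration_prior=1e-2,  # small prior encourages pruning of unused components
    max_iter=500,
    random_state=0,
)
labels = vb.fit_predict(X)

# Components whose posterior weight stays near zero are effectively unused, so the
# "effective" number of clusters emerges from inference rather than being fixed in advance.
effective = np.sum(vb.weights_ > 1e-2)
print("effective components:", effective)
print("posterior weights:", np.round(vb.weights_, 3))
```

With well-separated clusters, the posterior weights typically concentrate on three components while the rest collapse toward zero, mirroring the automatic model-selection behavior the abstract attributes to the Beta-Liouville variational framework.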

