Learning Finite Beta-Liouville Mixture Models via Variational Bayes for Proportional Data Clustering
Abstract
During the past decade, finite mixture modeling has become a well-established technique in data analysis and clustering. This paper focuses on developing a variational inference framework for learning finite Beta-Liouville mixture models, which have recently been proposed as an efficient way to cluster proportional data. In contrast to the conventional expectation-maximization (EM) algorithm commonly used for learning finite mixture models, the proposed algorithm is more efficient from a computational point of view and prevents both over- and under-fitting problems. Moreover, the complexity of the mixture model (i.e., the number of components) can be determined automatically and simultaneously with the parameter estimation, in closed form, as part of the Bayesian inference procedure. The merits of the proposed approach are shown using both artificial data sets and two interesting and challenging real applications, namely dynamic texture clustering and facial expression recognition.