
Learning Mixtures of Plackett-Luce Models


Abstract

In this paper we address the identifiability and efficient learning problems of finite mixtures of Plackett-Luce models for rank data. We prove that for any k ≥ 2, the mixture of k Plackett-Luce models over no more than 2k − 1 alternatives is non-identifiable, and this bound is tight for k = 2. For generic identifiability, we prove that the mixture of k Plackett-Luce models over m alternatives is generically identifiable if k ≤ ⌊(m − 2)/2⌋!. We also propose an efficient generalized method of moments (GMM) algorithm to learn the mixture of two Plackett-Luce models and show that the algorithm is consistent. Our experiments show that our GMM algorithm is significantly faster than the EMM algorithm by Gormley & Murphy (2008), while achieving competitive statistical efficiency.
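As background for the abstract: under a Plackett-Luce model, a ranking is built top-down by repeatedly selecting the next item from the remaining alternatives with probability proportional to its (positive) weight; a mixture first draws a component and then samples a ranking from it. The following minimal Python sketch (not from the paper; function names and parameters are illustrative) generates rank data from such a mixture:

```python
import random

def sample_pl(weights, rng):
    """Draw one full ranking from a Plackett-Luce model.

    At each step, the next item is chosen from the remaining
    alternatives with probability proportional to its weight.
    """
    remaining = list(range(len(weights)))
    ranking = []
    while remaining:
        i = rng.choices(remaining, weights=[weights[j] for j in remaining])[0]
        ranking.append(i)
        remaining.remove(i)
    return ranking

def sample_mixture(components, mix_probs, n, seed=0):
    """Draw n rankings from a finite mixture of Plackett-Luce models.

    components: list of weight vectors, one per mixture component
    mix_probs:  mixing proportions (need not be normalized)
    """
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        k = rng.choices(range(len(components)), weights=mix_probs)[0]
        data.append(sample_pl(components[k], rng))
    return data
```

For example, `sample_mixture([[4.0, 2.0, 1.0], [1.0, 2.0, 4.0]], [0.5, 0.5], 100)` yields rankings over m = 3 alternatives from a k = 2 mixture; note this case falls inside the paper's non-identifiable regime (m ≤ 2k − 1), which is exactly the setting its GMM algorithm for two components targets.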

