Kronecker Determinantal Point Processes

2020-02-05

Abstract 

Determinantal Point Processes (DPPs) are probabilistic models over all subsets of a ground set of N items. They have recently gained prominence in several applications that rely on "diverse" subsets. However, their applicability to large problems is still limited due to the O(N^3) complexity of core tasks such as sampling and learning. We enable efficient sampling and learning for DPPs by introducing KronDPP, a DPP model whose kernel matrix decomposes as a tensor (Kronecker) product of multiple smaller kernel matrices. This decomposition immediately enables fast exact sampling. But contrary to what one may expect, leveraging the Kronecker product structure for speeding up DPP learning turns out to be more difficult. We overcome this challenge, and derive batch and stochastic optimization algorithms for efficiently learning the parameters of a KronDPP.
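The "fast exact sampling" claim rests on a standard fact: the spectrum of a Kronecker product L = L1 ⊗ L2 consists of all pairwise products of the factors' eigenvalues, so the expensive O(N^3) eigendecomposition that spectral DPP samplers need reduces to eigendecompositions of the small factors. A minimal NumPy sketch of this property (illustrative only; the helper `random_psd` and the matrix sizes are invented for the example, and this is not the paper's sampling algorithm itself):

```python
import numpy as np

# Hedged sketch: why a Kronecker-factored DPP kernel L = L1 (x) L2
# makes core spectral quantities cheap. Not the paper's exact algorithm.
rng = np.random.default_rng(0)

def random_psd(n):
    """A random, well-conditioned positive semidefinite kernel (invented helper)."""
    A = rng.standard_normal((n, n))
    return A @ A.T + n * np.eye(n)

N1, N2 = 4, 5
L1, L2 = random_psd(N1), random_psd(N2)
L = np.kron(L1, L2)  # full N = N1*N2 kernel, built here only for checking

# Eigenvalues of L are all pairwise products of the factors' eigenvalues,
# so O(N1^3 + N2^3) work replaces the O(N^3) eigendecomposition of L.
e1 = np.linalg.eigvalsh(L1)
e2 = np.linalg.eigvalsh(L2)
kron_eigs = np.sort(np.outer(e1, e2).ravel())
assert np.allclose(kron_eigs, np.sort(np.linalg.eigvalsh(L)))

# The DPP normalizer det(L + I) likewise follows from the small factors.
Z_fast = np.prod(1.0 + np.outer(e1, e2))
assert np.allclose(Z_fast, np.linalg.det(L + np.eye(N1 * N2)))
```

Since spectral DPP samplers only need the eigenpairs of L, obtaining them from the factors is what makes exact sampling tractable at scale; learning is harder precisely because the log-likelihood does not factorize as cleanly over the Kronecker components.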

