DPPNet: Approximating Determinantal Point Processes with Deep Networks


Abstract

Determinantal point processes (DPPs) provide an elegant and versatile way to sample sets of items that balance the quality and the diversity of the selected items. For this reason, they have gained prominence in many machine learning applications that rely on subset selection. However, sampling from a DPP over a ground set of size N is a costly operation, requiring in general an O(N³) preprocessing cost and an O(Nk³) sampling cost for subsets of size k. We approach this problem by introducing DPPNets: generative deep models that produce DPP-like samples for arbitrary ground sets. We develop an inhibitive attention mechanism based on transformer networks that captures a notion of dissimilarity between feature vectors. We show theoretically that such an approximation is sensible, as it maintains the guarantees of inhibition, or dissimilarity, that make DPPs so powerful and unique. Empirically, we show across multiple datasets that DPPNet is orders of magnitude faster than competing approaches for DPP sampling, while generating high-likelihood samples and performing as well as DPPs on downstream tasks.
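To make the quoted costs concrete, the sketch below implements the standard exact DPP sampler (Hough et al. 2006; Kulesza & Taskar 2012, Algorithm 1), which is the baseline DPPNet is built to approximate, not the paper's own method. It assumes a positive semidefinite kernel L given as a NumPy array; the eigendecomposition is the O(N³) preprocessing step the abstract refers to, and the elimination loop dominates the per-sample cost.

import numpy as np

def sample_dpp(L, rng=None):
    """Exact DPP sampling via eigendecomposition (a minimal sketch)."""
    rng = rng or np.random.default_rng()
    N = L.shape[0]

    # O(N^3) preprocessing: eigendecompose the PSD kernel L.
    eigvals, eigvecs = np.linalg.eigh(L)

    # Keep eigenvector n independently with probability lambda_n / (lambda_n + 1).
    keep = rng.random(N) < eigvals / (eigvals + 1.0)
    V = eigvecs[:, keep]

    sample = []
    while V.shape[1] > 0:
        # Pick item i with probability proportional to the squared norm of row i of V.
        probs = np.sum(V ** 2, axis=1)
        i = rng.choice(N, p=probs / probs.sum())
        sample.append(i)

        # Restrict V to an orthonormal basis of its span orthogonal to e_i.
        j = np.argmax(np.abs(V[i]))           # a column with V[i, j] != 0
        V = V - np.outer(V[:, j], V[i] / V[i, j])
        V = np.delete(V, j, axis=1)
        if V.shape[1] > 0:
            V, _ = np.linalg.qr(V)
    return sample

Because the expensive eigendecomposition must be redone whenever the ground set changes, this exact sampler is what DPPNet amortizes away: the trained network emits DPP-like samples directly for arbitrary ground sets, without the O(N³) step.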



Popular Resources

  • Stratified Strate...

    In this paper we introduce Stratified Strategy ...

  • The Variational S...

    Unlike traditional images which do not offer in...

  • Learning to learn...

    The move from hand-designed features to learned...

  • A Mathematical Mo...

    Direct democracy, where each voter casts one vo...

  • Learning to Predi...

    Much of model-based reinforcement learning invo...