Cuckoo Feature Hashing: Dynamic Weight Sharing for Sparse Analytics

Abstract: Feature hashing is widely used to process large-scale sparse features when learning predictive models. Collisions inherently occur in the hashing process and hurt model performance. In this paper, we develop a new feature hashing scheme called Cuckoo Feature Hashing (CCFH), which treats feature hashing as a problem of dynamic weight sharing during model training. By leveraging a set of indicators to dynamically decide the weight of each feature based on alternative hash locations, CCFH effectively prevents collisions between features that are important to the model, i.e. predictive features, and thus avoids model performance degradation. Experimental results on prediction tasks with hundreds of millions of features demonstrate that CCFH achieves the same level of performance using only 15%-25% of the parameters required by conventional feature hashing.
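The core idea in the abstract, two candidate hash locations per feature plus a learned indicator that decides which location's weight the feature uses, can be illustrated with a small sketch. The code below is not the paper's algorithm; it is a minimal Python illustration under several assumptions: a logistic-regression objective, a soft sigmoid gate standing in for the indicator, and plain SGD updates. Names such as CuckooHasher and sgd_step are hypothetical.

```python
# Minimal sketch of cuckoo-style feature hashing for a sparse linear model.
# Assumptions not taken from the paper: a logistic-loss objective, a soft
# sigmoid gate as the per-feature indicator, and plain SGD. In CCFH the
# indicators would be stored compactly; the dict here is only for clarity.
import hashlib
import numpy as np


def slot(feature_id: str, seed: int, buckets: int) -> int:
    """Deterministically hash a feature id into one of `buckets` slots."""
    digest = hashlib.md5(f"{seed}:{feature_id}".encode()).hexdigest()
    return int(digest, 16) % buckets


class CuckooHasher:
    """Hypothetical class: each feature has two candidate weight slots and a
    learned indicator deciding how much of each slot it uses."""

    def __init__(self, buckets: int = 1 << 20, lr: float = 0.1):
        self.buckets = buckets
        self.lr = lr
        self.weights = np.zeros(buckets)  # shared weight table
        self.logit = {}                   # per-feature indicator logit

    def _slots(self, f: str):
        return slot(f, 0, self.buckets), slot(f, 1, self.buckets)

    def _gate(self, f: str) -> float:
        # Probability of using the second slot (sigmoid of the indicator logit).
        return 1.0 / (1.0 + np.exp(-self.logit.get(f, 0.0)))

    def score(self, features) -> float:
        """Linear score; each feature's weight is a gated mix of its two slots."""
        s = 0.0
        for f, x in features:
            i, j = self._slots(f)
            g = self._gate(f)
            s += x * ((1.0 - g) * self.weights[i] + g * self.weights[j])
        return s

    def sgd_step(self, features, label: int) -> None:
        """One logistic-loss SGD step on the weight table and the indicators."""
        p = 1.0 / (1.0 + np.exp(-self.score(features)))
        err = p - label  # dLoss/dScore for logistic loss
        for f, x in features:
            i, j = self._slots(f)
            g = self._gate(f)
            # Gradient w.r.t. the indicator logit (chain rule through the gate),
            # computed before the weights move.
            d_logit = err * x * (self.weights[j] - self.weights[i]) * g * (1.0 - g)
            # Gradients w.r.t. the two shared weight slots.
            self.weights[i] -= self.lr * err * x * (1.0 - g)
            self.weights[j] -= self.lr * err * x * g
            self.logit[f] = self.logit.get(f, 0.0) - self.lr * d_logit


# Usage: an example is a list of (feature_id, value) pairs.
model = CuckooHasher(buckets=1 << 18)
model.sgd_step([("user=42", 1.0), ("ad=7", 1.0)], label=1)
print(model.score([("user=42", 1.0), ("ad=7", 1.0)]))
```

The intended effect, as the abstract describes it, is that a feature whose primary slot collides with a predictive feature can shift its weight to the alternative location, so important features keep effectively dedicated parameters while the long tail shares.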
