A no-regret generalization of hierarchical softmax to extreme multi-label classification

2020-02-13

Abstract

Extreme multi-label classification (XMLC) is the problem of tagging an instance with a small subset of relevant labels chosen from an extremely large pool of possible labels. Large label spaces can be handled efficiently by organizing labels as a tree, as in the hierarchical softmax (HSM) approach commonly used for multi-class problems. In this paper, we investigate probabilistic label trees (PLTs), which have recently been devised for tackling XMLC problems. We show that PLTs are a no-regret multi-label generalization of HSM when precision@k is used as the model evaluation metric. Critically, we prove that the pick-one-label heuristic, a reduction technique from multi-label to multi-class that is routinely used along with HSM, is not consistent in general. We also show that our implementation of PLTs, referred to as EXTREMETEXT (XT), obtains significantly better results than HSM with the pick-one-label heuristic and XML-CNN, a deep network specifically designed for XMLC problems. Moreover, XT is competitive with many state-of-the-art approaches in terms of statistical performance, model size, and prediction time, which makes it amenable to deployment in an online system.
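The abstract relies on two ingredients that are easy to make concrete: scoring a label by multiplying conditional probabilities along the root-to-leaf path of a label tree (the mechanism shared by HSM and PLTs), and precision@k, the metric under which PLTs are shown to be no-regret. The following Python sketch illustrates both on a toy tree; it is not the EXTREMETEXT implementation, and all class names, probabilities, and labels here are illustrative assumptions.

# Minimal sketch (assumed toy example, not the authors' code):
# (1) score each leaf label by the product of conditional probabilities
#     along its root-to-leaf path, as in HSM/PLT-style label trees;
# (2) evaluate the top-k predictions with precision@k.

from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Node:
    prob: float = 1.0                 # conditional probability of this node given its parent (and the instance)
    label: Optional[str] = None       # set only at leaves
    children: list["Node"] = field(default_factory=list)


def leaf_scores(node: Node, path_prob: float = 1.0) -> dict[str, float]:
    """Score each leaf label by the product of conditional probabilities on its path."""
    path_prob *= node.prob
    if node.label is not None:
        return {node.label: path_prob}
    scores: dict[str, float] = {}
    for child in node.children:
        scores.update(leaf_scores(child, path_prob))
    return scores


def precision_at_k(scores: dict[str, float], relevant: set[str], k: int) -> float:
    """Fraction of the k highest-scored labels that are actually relevant."""
    top_k = sorted(scores, key=scores.get, reverse=True)[:k]
    return sum(lbl in relevant for lbl in top_k) / k


if __name__ == "__main__":
    # Toy tree over four labels; the conditional probabilities are made up.
    root = Node(children=[
        Node(prob=0.7, children=[Node(prob=0.6, label="cat"), Node(prob=0.4, label="dog")]),
        Node(prob=0.3, children=[Node(prob=0.9, label="car"), Node(prob=0.1, label="bus")]),
    ])
    scores = leaf_scores(root)  # cat=0.42, dog=0.28, car=0.27, bus=0.03
    print(precision_at_k(scores, relevant={"cat", "car"}, k=2))  # top-2 are cat, dog -> 0.5

In a PLT, unlike in HSM, each node carries an independent binary classifier, so the per-node probabilities need not sum to one across siblings; the path-product scoring above is the part the two approaches share.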
