LEARNING FROM EXPLANATIONS WITH NEURAL MODULE EXECUTION TREE

2020-01-02

Abstract

While deep neural networks have achieved impressive performance on a range of NLP tasks, these data-hungry models rely heavily on labeled data. To make the most of each example, previous work has introduced natural language (NL) explanations to serve as supplements to mere labels. Such NL explanations can provide sufficient domain knowledge for generating more labeled data over new instances, while the annotation time only doubles. However, directly applying NL explanations to augment model learning encounters two challenges. First, NL explanations are unstructured and inherently compositional, which calls for a modularized model to represent their semantics. Second, NL explanations often have large numbers of linguistic variants, resulting in low recall and limited generalization ability when applied to unlabeled data. In this paper, we propose a novel Neural Module Execution Tree (NMET) framework for augmenting sequence classification with NL explanations. First, we transform NL explanations into executable logical forms using a semantic parser. NMET then employs a neural module network architecture to generalize the different types of actions (specified by the logical forms) for labeling data instances, and accumulates the results with soft logic, which substantially increases the coverage of each NL explanation. Experiments on two NLP tasks, relation extraction and sentiment analysis, demonstrate its superiority over baseline methods by leveraging NL explanations. Its extension to multi-hop question answering achieves a performance gain with light annotation effort. NMET also achieves much better performance than traditional label-only supervised models given the same annotation time.
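The abstract describes a pipeline in which a semantic parser compiles each NL explanation into a logical form, neural modules relax each atomic check into a soft score, and soft logic accumulates those scores into a matching confidence used to weakly label unlabeled instances. The snippet below is a minimal, illustrative sketch of that soft-logic accumulation only; the function names (string_match, distance_score, soft_and, soft_or), the product and probabilistic-sum relaxations, and the 0.5 threshold are assumptions chosen for illustration, not the paper's actual modules or hyperparameters.

```python
# Illustrative sketch of soft-logic accumulation over module scores.
# All names here are hypothetical stand-ins, not the NMET implementation.

def string_match(sentence_tokens, phrase_tokens):
    """Soft score in [0, 1] for whether the phrase occurs in the sentence.
    A real neural module would compare contextual embeddings; exact n-gram
    overlap is used here only as a placeholder."""
    n = len(phrase_tokens)
    hits = sum(
        1 for i in range(len(sentence_tokens) - n + 1)
        if sentence_tokens[i:i + n] == phrase_tokens
    )
    return min(1.0, hits)

def distance_score(position_a, position_b, max_dist):
    """Soft score that decays as the token distance exceeds max_dist."""
    gap = abs(position_a - position_b)
    return 1.0 if gap <= max_dist else max_dist / gap

def soft_and(*scores):
    """Product t-norm: a soft relaxation of logical AND."""
    result = 1.0
    for s in scores:
        result *= s
    return result

def soft_or(*scores):
    """Probabilistic sum: a soft relaxation of logical OR."""
    result = 0.0
    for s in scores:
        result = result + s - result * s
    return result

# Example: an explanation such as "the phrase 'was born in' appears within
# 3 tokens of the subject" could compile to a conjunction of two modules.
sentence = "Barack Obama was born in Honolulu".split()
score = soft_and(
    string_match(sentence, "was born in".split()),
    distance_score(sentence.index("Obama"), sentence.index("born"), max_dist=3),
)
if score > 0.5:  # assumed threshold for assigning a pseudo-label
    print(f"pseudo-label this sentence (match score = {score:.2f})")
```

Because every module returns a graded score rather than a hard boolean, a sentence that only approximately satisfies an explanation can still contribute a weak label, which is how this style of soft matching increases the coverage of each explanation.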


