
Statistical Parsing with Probabilistic Symbol-Refined Tree Substitution Grammars

2019-11-08

Abstract: We present probabilistic Symbol-Refined Tree Substitution Grammars (SR-TSG) for statistical parsing of natural language sentences. An SR-TSG is an extension of the conventional TSG model in which each nonterminal symbol can be refined (subcategorized) to fit the training data. Our probabilistic model is consistently based on the hierarchical Pitman-Yor Process, which encodes backoff smoothing from fine-grained SR-TSG rules to simpler CFG rules, so all grammar rules can be learned from training data in a fully automatic fashion. Our SR-TSG parser achieves state-of-the-art performance on the Wall Street Journal (WSJ) English Penn Treebank data.
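The backoff smoothing described in the abstract can be illustrated with a minimal sketch of a single Pitman-Yor restaurant: the probability of a fine-grained rule interpolates between its discounted observed count and a simpler backoff distribution. This is a hypothetical illustration, not the paper's implementation; the function name, parameters, and the uniform CFG-level backoff used below are assumptions for the example.

```python
def pyp_prob(rule, counts, tables, total, total_tables, backoff,
             d=0.5, theta=1.0):
    """Pitman-Yor predictive probability of `rule`.

    counts[rule]  -- observed count of the rule in this context
    tables[rule]  -- number of "tables" (CRP clusters) serving the rule
    total         -- total rule count in this context
    total_tables  -- total number of tables in this context
    backoff(rule) -- probability under the simpler (e.g. CFG-level) model
    d, theta      -- discount and concentration hyperparameters
    """
    c = counts.get(rule, 0)
    t = tables.get(rule, 0)
    # Discounted count mass for this rule, plus interpolation with backoff.
    p = max(c - d * t, 0.0) / (theta + total)
    p += (theta + d * total_tables) / (theta + total) * backoff(rule)
    return p


# Toy usage: one context with a single observed rule "A" and a uniform
# backoff over two candidate rules {"A", "B"}.
counts = {"A": 2}
tables = {"A": 1}
uniform = lambda r: 0.5
p_a = pyp_prob("A", counts, tables, total=2, total_tables=1, backoff=uniform)
p_b = pyp_prob("B", counts, tables, total=2, total_tables=1, backoff=uniform)
```

Unseen rules still receive mass through the backoff term, which is the smoothing effect the paper exploits when falling back from SR-TSG rules toward CFG rules.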

