Unsupervised Learning of PCFGs with Normalizing Flow

2019-09-20
Abstract: Unsupervised PCFG inducers hypothesize sets of compact context-free rules as explanations for sentences. These models not only provide tools for low-resource languages, but also play an important role in modeling language acquisition (Bannard et al., 2009; Abend et al., 2017). However, current PCFG induction models, which use word tokens as input, are unable to incorporate semantics and morphology into induction, and may encounter sparse-vocabulary issues when facing morphologically rich languages. This paper describes a neural PCFG inducer which employs context embeddings (Peters et al., 2018) in a normalizing flow model (Dinh et al., 2015) to extend PCFG induction to use semantic and morphological information. A linguistically motivated similarity penalty and categorical distance constraints are imposed on the inducer as regularization. Experiments show that the PCFG induction model with normalizing flow produces grammars with state-of-the-art accuracy on a variety of different languages. Ablation further shows a positive effect of normalizing flow, context embeddings, and the proposed regularizers.
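To make the normalizing-flow ingredient concrete, the following is a minimal sketch of a NICE-style additive coupling layer (Dinh et al., 2015), the kind of exactly invertible transform a flow-based inducer could use to map context embeddings into a space where a simple prior fits. The coupling-network form, dimensions, and function names below are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 8                                   # toy embedding dimensionality (assumed)
W = rng.normal(size=(D // 2, D // 2))   # parameters of the toy coupling network
b = rng.normal(size=D // 2)

def coupling_net(x):
    """Arbitrary (even non-invertible) network; invertibility of the layer
    comes from the coupling structure, not from this function."""
    return np.tanh(x @ W + b)

def forward(x):
    """y1 = x1, y2 = x2 + m(x1). Additive coupling is volume-preserving,
    so log|det J| = 0 and density evaluation reduces to the prior on y."""
    x1, x2 = x[: D // 2], x[D // 2 :]
    return np.concatenate([x1, x2 + coupling_net(x1)])

def inverse(y):
    """Exact inverse: x1 = y1, x2 = y2 - m(y1)."""
    y1, y2 = y[: D // 2], y[D // 2 :]
    return np.concatenate([y1, y2 - coupling_net(y1)])

emb = rng.normal(size=D)                # stand-in for one context embedding
z = forward(emb)
assert np.allclose(inverse(z), emb)     # the flow is exactly invertible
```

Stacking several such layers (alternating which half is left unchanged) yields a flexible yet tractable density over embeddings, which is what lets the inducer score word emissions without a closed vocabulary.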

