
Unsupervised Word and Dependency Path Embeddings for Aspect Term Extraction


Abstract
In this paper, we develop a novel approach to aspect term extraction based on unsupervised learning of distributed representations of words and dependency paths. The basic idea is to connect two words (w1 and w2) with the dependency path (r) between them in the embedding space. Specifically, our method optimizes the objective w1 + r ≈ w2 in the low-dimensional space, where the multi-hop dependency paths are treated as a sequence of grammatical relations and modeled by a recurrent neural network. Then, we design the embedding features that consider linear context and dependency context information for the conditional random field (CRF) based aspect term extraction. Experimental results on the SemEval datasets show that, (1) with only embedding features, we can achieve state-of-the-art results; (2) our embedding method, which incorporates the syntactic information among words, yields better performance than other representative ones in aspect term extraction.
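To make the embedding objective concrete, below is a minimal sketch (not the authors' code) of the idea described in the abstract: word vectors w1 and w2 are connected by a path vector r composed from the sequence of grammatical relations on the dependency path via a recurrent network, so that w1 + r ≈ w2. The dimensions, the GRU choice, the dot-product score, and the margin-based loss with a sampled negative word are illustrative assumptions.

```python
import torch
import torch.nn as nn

class WordPathEmbedder(nn.Module):
    def __init__(self, vocab_size, num_relations, dim=100):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, dim)    # word vectors w
        self.rel_emb = nn.Embedding(num_relations, dim)  # grammatical-relation vectors
        self.rnn = nn.GRU(dim, dim, batch_first=True)    # composes a multi-hop path into r

    def path_vector(self, rel_ids):
        # rel_ids: (batch, path_len) relation ids along the dependency path
        rel_vecs = self.rel_emb(rel_ids)
        _, h = self.rnn(rel_vecs)
        return h.squeeze(0)                              # r: (batch, dim)

    def forward(self, w1_ids, rel_ids, w2_ids, neg_w2_ids):
        w1 = self.word_emb(w1_ids)
        w2 = self.word_emb(w2_ids)
        neg = self.word_emb(neg_w2_ids)
        r = self.path_vector(rel_ids)
        # score how well w1 + r predicts w2 (dot product used here as an assumption)
        pos_score = ((w1 + r) * w2).sum(-1)
        neg_score = ((w1 + r) * neg).sum(-1)
        # margin-based ranking loss against a sampled negative target word
        return torch.relu(1.0 - pos_score + neg_score).mean()
```

Once trained this way, the learned word and path embeddings can be turned into features (for the linear context and the dependency context of each token) and fed to a CRF tagger for aspect term extraction, as the abstract describes; the feature construction itself is not shown here.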
