Abstract
Aspect-based sentiment analysis (ABSA) aims to predict the fine-grained sentiment of a comment with respect to a given aspect term or category. Previous ABSA methods have recognized and verified the importance of aspect information. Most existing LSTM-based models take the aspect into account via an attention mechanism, where attention weights are calculated only after the context has been modeled as contextual vectors. However, during context modeling, classic LSTM cells may already have discarded aspect-related information and retained aspect-irrelevant information, leaving room to generate more effective context representations. This paper proposes a novel variant of LSTM, termed aspect-aware LSTM (AA-LSTM), which incorporates aspect information into the LSTM cells during the context modeling stage, before the attention mechanism is applied. AA-LSTM can therefore dynamically produce aspect-aware contextual representations. We experiment with several representative LSTM-based models by replacing their classic LSTM cells with AA-LSTM cells. Experimental results on the SemEval-2014 datasets demonstrate the effectiveness of AA-LSTM.
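To make the core idea concrete, the sketch below shows one hypothetical way aspect information could enter an LSTM cell before any attention is computed: the aspect embedding is concatenated with the token input and previous hidden state, so every gate is conditioned on the aspect while the context is being modeled. This is only an illustrative NumPy sketch under assumed dimensions and random weights, not the paper's actual AA-LSTM formulation; the function `aa_lstm_step` and all parameter names are invented for this example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def aa_lstm_step(x_t, h_prev, c_prev, a, params):
    """One hypothetical aspect-aware LSTM step: the aspect embedding `a`
    is fed into every gate alongside the input and previous hidden state,
    so the cell can keep aspect-relevant context and drop the rest."""
    z = np.concatenate([x_t, h_prev, a])             # aspect enters the cell itself
    i = sigmoid(params["W_i"] @ z + params["b_i"])   # input gate, aspect-conditioned
    f = sigmoid(params["W_f"] @ z + params["b_f"])   # forget gate, aspect-conditioned
    o = sigmoid(params["W_o"] @ z + params["b_o"])   # output gate, aspect-conditioned
    g = np.tanh(params["W_c"] @ z + params["b_c"])   # candidate cell state
    c_t = f * c_prev + i * g
    h_t = o * np.tanh(c_t)
    return h_t, c_t

# Toy dimensions; weights are random because this is a structural sketch only.
rng = np.random.default_rng(0)
d_x, d_h, d_a = 4, 3, 2
params = {}
for name in ("i", "f", "o", "c"):
    params[f"W_{name}"] = rng.standard_normal((d_h, d_x + d_h + d_a)) * 0.1
    params[f"b_{name}"] = np.zeros(d_h)

h, c = np.zeros(d_h), np.zeros(d_h)
aspect = rng.standard_normal(d_a)            # e.g. an embedding of the aspect term
for x_t in rng.standard_normal((5, d_x)):    # run over 5 context tokens
    h, c = aa_lstm_step(x_t, h, c, aspect, params)

# Each hidden state produced this way is already aspect-aware, so any
# downstream attention mechanism operates on aspect-conditioned vectors.
print(h.shape)
```

The design choice this illustrates is the abstract's central claim: by injecting the aspect inside the recurrence rather than only in a post-hoc attention layer, the contextual vectors themselves depend on the aspect.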