Recurrent Neural Network for Text Classification with Hierarchical Multiscale Dense Connections
Abstract
Text classification is a fundamental task in many
Natural Language Processing applications. While
recurrent neural networks have achieved great success in text classification, they fail to
capture the hierarchical structure and long-term semantic dependencies that are common features
of text data. Inspired by the advent of the dense
connection pattern in advanced convolutional neural networks, we propose a simple yet effective
recurrent architecture, named Hierarchical Multiscale Densely Connected RNNs (HM-DenseRNNs),
which: 1) enables direct access to the hidden states
of all preceding recurrent units via dense connections, and 2) organizes multiple densely connected recurrent units into a hierarchical multiscale structure, where the layers are updated at different scales. HM-DenseRNNs can effectively capture long-term dependencies among words in long
text data, and a dense recurrent block is further introduced to reduce the number of parameters and
enhance training efficiency. We evaluate the performance of our proposed architecture on three text
datasets and the results verify the advantages of
HM-DenseRNNs over the baseline methods in terms
of classification accuracy.
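The dense connection pattern described above (point 1 of the abstract) can be sketched in a few lines: each recurrent unit in the stack receives, at every time step, the concatenation of the input and the hidden states of all preceding units, rather than only the output of the unit directly below it. The following is a minimal NumPy sketch under that assumption; the function and variable names are hypothetical and this is not the authors' implementation (it omits the hierarchical multiscale updates and the dense recurrent block).

```python
import numpy as np

def dense_rnn_step(x_t, prev_hiddens, weights):
    """One time step through a stack of densely connected vanilla RNN cells.

    Dense connection: the input to layer k is x_t concatenated with the
    new hidden states of all preceding layers 0..k-1 at this time step.
    """
    new_hiddens = []
    for layer, (W_in, W_rec, b) in enumerate(weights):
        inp = np.concatenate([x_t] + new_hiddens)  # direct access to earlier layers
        h = np.tanh(W_in @ inp + W_rec @ prev_hiddens[layer] + b)
        new_hiddens.append(h)
    return new_hiddens

# Toy dimensions (hypothetical): 3 layers, hidden size 4, input size 5.
rng = np.random.default_rng(0)
input_size, hidden_size, num_layers = 5, 4, 3
weights = []
for layer in range(num_layers):
    # The input dimension of each layer grows with the dense connections.
    in_dim = input_size + layer * hidden_size
    weights.append((rng.standard_normal((hidden_size, in_dim)) * 0.1,
                    rng.standard_normal((hidden_size, hidden_size)) * 0.1,
                    np.zeros(hidden_size)))

hiddens = [np.zeros(hidden_size) for _ in range(num_layers)]
for t in range(6):  # run a short toy sequence
    x_t = rng.standard_normal(input_size)
    hiddens = dense_rnn_step(x_t, hiddens, weights)

print([h.shape for h in hiddens])  # every layer keeps hidden size 4
```

Note that the per-layer hidden size stays fixed; only the input projection widens as more preceding states are concatenated, which is why the paper introduces dense recurrent blocks to bound parameter growth.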