Knowledge-enhanced Hierarchical Attention for Community Question Answering
with Multi-task and Adaptive Learning
Abstract
In this paper, we propose a Knowledge-enhanced
Hierarchical Attention network for community question answering with Multi-task and Adaptive
learning (KHAMA). First, we design a hierarchical attention network that fully fuses knowledge from
input documents and a knowledge base (KB) by exploiting the semantic compositionality of the input
sequences. The external factual knowledge helps the model
recognize background knowledge (entity mentions
and their relationships) and filter out noisy information from long documents with complex syntactic and semantic structures. In addition, we build multiple CQA models with adaptive
boosting and then combine these models to learn
a more effective and robust CQA system. Furthermore, KHAMA is a multi-task learning model. It
treats CQA as the primary task and question categorization as an auxiliary task, aiming to learn a
category-aware document encoder and to better
identify essential information in
long questions. Extensive experiments on two
benchmarks demonstrate that KHAMA achieves
substantial improvements over strong baseline methods.
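The adaptive-boosting step mentioned above (building multiple CQA models and combining them into a more robust system) follows the classic AdaBoost recipe. The sketch below is a hypothetical illustration only: KHAMA's base learners are neural CQA models, which we replace here with simple decision stumps on toy one-dimensional data so the weighting and combination logic stays visible.

```python
# Minimal AdaBoost-style ensemble sketch (illustrative; not KHAMA's actual
# base learners). Each round fits a weak learner on re-weighted samples,
# then the ensemble predicts by a weighted vote of all rounds.
import math

def train_stump(xs, ys, weights):
    """Pick the threshold/polarity stump with minimum weighted error."""
    best = None
    for thr in sorted(set(xs)):
        for polarity in (1, -1):
            preds = [polarity if x >= thr else -polarity for x in xs]
            err = sum(w for p, y, w in zip(preds, ys, weights) if p != y)
            if best is None or err < best[0]:
                best = (err, thr, polarity)
    return best  # (weighted_error, threshold, polarity)

def adaboost(xs, ys, rounds=5):
    n = len(xs)
    weights = [1.0 / n] * n
    ensemble = []  # list of (alpha, threshold, polarity)
    for _ in range(rounds):
        err, thr, pol = train_stump(xs, ys, weights)
        err = max(err, 1e-10)  # guard against log(1/0) on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, thr, pol))
        # Re-weight samples: boost the misclassified ones for the next round.
        preds = [pol if x >= thr else -pol for x in xs]
        weights = [w * math.exp(-alpha * y * p)
                   for w, y, p in zip(weights, ys, preds)]
        z = sum(weights)
        weights = [w / z for w in weights]
    return ensemble

def predict(ensemble, x):
    """Weighted majority vote over all boosted weak learners."""
    score = sum(alpha * (pol if x >= thr else -pol)
                for alpha, thr, pol in ensemble)
    return 1 if score >= 0 else -1

xs = [1, 2, 3, 4, 5, 6]
ys = [-1, -1, -1, 1, 1, 1]
model = adaboost(xs, ys, rounds=3)
print([predict(model, x) for x in xs])  # -> [-1, -1, -1, 1, 1, 1]
```

In KHAMA's setting, each "stump" would instead be a full CQA model, and the same per-sample re-weighting focuses later models on question-answer pairs the earlier models got wrong.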