Abstract
Knowledge Graph (KG) embedding has become
crucial for the task of link prediction. Recent work
applies encoder-decoder models to tackle this problem, where an encoder is formulated as a graph
neural network (GNN) and a decoder is represented
by an embedding method. These approaches enrich embedding techniques with structural information. Unfortunately, existing GNN-based frameworks still confront three severe problems: low representational power, stacking in a flat way, and poor
robustness to noise. In this work, we propose a
novel multi-level graph neural network (M-GNN)
to address the above challenges. We first identify an
injective aggregate scheme and design a powerful
GNN layer using multi-layer perceptrons (MLPs).
Then, we define graph coarsening schemes for various kinds of relations, and stack GNN layers on
a series of coarsened graphs, so as to model hierarchical structures. Furthermore, attention mechanisms are adopted so that our approach can make
predictions accurately even on noisy knowledge graphs. Results on the WN18 and FB15k datasets
show that our approach is effective in the standard
link prediction task, significantly and consistently
outperforming competitive baselines. Moreover,
robustness analysis on the FB15k-237 dataset demonstrates that the proposed M-GNN is highly robust
to sparsity and noise.