Abstract
Previous cross-lingual knowledge graph (KG) alignment studies rely on entity embeddings derived only from monolingual KG structural information, which may fail to match entities that have different facts in the two KGs. In this paper, we introduce the topic entity graph, a local sub-graph of an entity, to represent entities with their contextual information in the KG. From this view, the KG-alignment task can be formulated as a graph matching problem; we further propose a graph-attention based solution, which first matches all entities in two topic entity graphs and then jointly models the local matching information to derive a graph-level matching vector. Experiments show that our model outperforms previous state-of-the-art methods by a large margin.