EL Embeddings: Geometric Construction of Models for the Description Logic EL++

2019-10-11
Abstract: An embedding is a function that maps entities from one algebraic structure into another while preserving certain characteristics. Embeddings are used successfully for mapping relational data or text into vector spaces, where they can be used for machine learning, similarity search, or similar tasks. We address the problem of finding vector space embeddings for theories in the Description Logic EL++ that are also models of the TBox. To find such embeddings, we define an optimization problem that characterizes the model-theoretic semantics of the operators in EL++ within R^n, thereby solving the problem of finding an interpretation function for an EL++ theory given a particular domain Δ. Our approach is mainly relevant to large EL++ theories and knowledge bases such as the ontologies and knowledge graphs used in the life sciences. We demonstrate that our method can be used for improved prediction of protein–protein interactions when compared to semantic similarity measures or knowledge graph embeddings.
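To make the abstract's optimization problem concrete, here is a minimal sketch of the geometric intuition commonly associated with this approach: each class is embedded as an n-ball (a center vector plus a radius), and a subsumption axiom C ⊑ D is satisfied when C's ball lies inside D's ball, which suggests a hinge-style loss. The function name `nf1_loss` and the `margin` parameter are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def nf1_loss(c_c, r_c, c_d, r_d, margin=0.0):
    """Illustrative loss for a subsumption axiom C ⊑ D.

    Each class is an n-ball: center `c_*` (numpy array) and radius `r_*`.
    The loss is zero when C's ball fits entirely inside D's ball, and
    grows with how far C's ball protrudes out of D's ball.
    """
    return max(0.0, np.linalg.norm(c_c - c_d) + r_c - r_d - margin)

# C (center origin, radius 1) inside D (center origin, radius 3): loss 0.
inside = nf1_loss(np.array([0.0, 0.0]), 1.0, np.array([0.0, 0.0]), 3.0)

# C shifted to (5, 0): its ball sticks out of D's ball, so the loss is positive.
outside = nf1_loss(np.array([5.0, 0.0]), 1.0, np.array([0.0, 0.0]), 3.0)
```

Minimizing such losses over all TBox axioms simultaneously is what yields an embedding that is also a geometric model of the theory.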

