Deep Learning with Relational Logic Representations

2019-10-11
As the process of maximizing accuracies on large perceptual datasets slowly reaches its limits, deep learning researchers are starting to explore increasingly ambitious goals for neural networks, with thoroughly tuned neural architectures continuously taking over new domains on a daily basis. Despite their significant success, all existing neural architectures based on static computational graphs processing fixed tensor representations necessarily face fundamental limitations when presented with dynamically sized and structured data. Examples of such data are sparse multi-relational structures, present everywhere from biological networks and complex knowledge hyper-graphs to logical theories. Likewise, given the cryptic nature of generalization and representation learning in neural networks, potential integration with the sheer amount of existing symbolic abstractions present in human knowledge remains highly problematic. Here, we argue that these abilities, naturally present in symbolic approaches based on the expressive power of relational logic, need to be adopted for neural networks to progress further into more complex domains.
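To make the contrast with fixed tensor inputs concrete, the sketch below is an illustration only, not code from the paper: it encodes two relational examples of different sizes as sets of ground facts and builds the forward computation per example by following the relational structure itself. The names `embed` and `relational_forward`, the toy deterministic embedding, and the averaging readout are all assumptions made for this example.

```python
# Minimal illustrative sketch (not from the paper): each example is a set of
# relational ground facts, so its "shape" differs from example to example and
# cannot be packed into one fixed tensor up front. A tiny message-passing step
# over the relations shows how a computation graph can instead be built
# per example, following the structure of the data itself.
from collections import defaultdict

# Two examples with different numbers of entities and relations.
example_1 = [("bond", "c1", "c2"), ("bond", "c2", "o1")]
example_2 = [("bond", "n1", "c1"), ("bond", "c1", "c2"),
             ("bond", "c2", "c3"), ("bond", "c3", "n1")]

def embed(entity):
    # Hypothetical 2-d embedding: a deterministic stand-in for learned parameters.
    code = sum(ord(c) for c in entity)
    return [(code % 7) / 7.0, (code % 11) / 11.0]

def relational_forward(facts):
    """Build the computation dynamically from the relational structure:
    every entity aggregates the embeddings of its relational neighbours."""
    neighbours = defaultdict(list)
    for _, a, b in facts:
        neighbours[a].append(b)
        neighbours[b].append(a)
    pooled = {}
    for entity, nbrs in neighbours.items():
        vecs = [embed(n) for n in nbrs]
        pooled[entity] = [sum(dim) / len(vecs) for dim in zip(*vecs)]
    # Read out a single score for the whole (variable-sized) structure.
    return sum(sum(v) for v in pooled.values()) / len(pooled)

print(relational_forward(example_1))  # works for a structure with 3 entities ...
print(relational_forward(example_2))  # ... and, unchanged, for one with 4
```

The point of the sketch is that the same forward pass handles both examples because the computation is derived from the relations at run time, whereas a fixed-shape tensor encoding would have to be padded or re-engineered for every new structure.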
