CO-ATTENTIVE EQUIVARIANT NEURAL NETWORKS: FOCUSING EQUIVARIANCE ON TRANSFORMATIONS CO-OCCURRING IN DATA


2019-12-30

Abstract

Equivariance is a desirable property: it yields much more parameter-efficient neural architectures and preserves the structure of the input through the feature mapping. Even though some combinations of transformations might never appear (e.g., an upright face with a horizontal nose), current equivariant architectures consider the set of all possible transformations in a transformation group when learning feature representations. In contrast, the human visual system is able to attend to the set of relevant transformations occurring in the environment and uses this information to assist and improve object recognition. Based on this observation, we modify conventional equivariant feature mappings so that they attend to the set of transformations co-occurring in the data, and we generalize this notion to groups consisting of multiple symmetries. We show that our proposed co-attentive equivariant neural networks consistently outperform conventional rotation-equivariant and rotation & reflection-equivariant neural networks on rotated MNIST and CIFAR-10.
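The idea the abstract describes, an equivariant feature mapping whose per-transformation responses are reweighted by learned attention, can be illustrated with a minimal sketch. The sketch below assumes the p4 group (rotations by multiples of 90°) and a learned per-pose attention vector; the function names and the attention parameterization are hypothetical simplifications, not the paper's actual implementation.

```python
import numpy as np

def correlate2d_valid(x, w):
    """Plain 'valid' cross-correlation of a 2-D input with a 2-D filter."""
    k = w.shape[0]
    out = np.empty((x.shape[0] - k + 1, x.shape[1] - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + k, j:j + k] * w)
    return out

def lifting_conv_p4(x, w):
    """Lift a planar input to the rotation group p4: correlate with each of
    the four 90-degree-rotated copies of the filter, giving one response
    map per pose. Rotating the input permutes (and rotates) the pose maps,
    which is the equivariance property."""
    return np.stack([correlate2d_valid(x, np.rot90(w, r)) for r in range(4)])

def co_attention(responses, logits):
    """Reweight the per-pose responses with a softmax over poses, so the
    layer can focus on transformations that actually co-occur in the data.
    `logits` stands in for a learned length-4 parameter vector
    (a hypothetical parameterization for illustration)."""
    a = np.exp(logits - logits.max())
    a = a / a.sum()
    return responses * a[:, None, None]

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))      # toy single-channel input
w = rng.standard_normal((3, 3))      # toy filter
r = lifting_conv_p4(x, w)            # shape (4, 6, 6): pose axis first
y = co_attention(r, np.array([2.0, 0.0, 0.0, 0.0]))  # attend mostly to pose 0
```

Note that plain group convolution treats all four poses uniformly; the attention step is what breaks that uniformity, suppressing poses that rarely co-occur in the training data while the underlying lifted representation stays equivariant.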

