
CONVOLUTIONAL CONDITIONAL NEURAL PROCESSES

2020-01-02

Abstract

We introduce the Convolutional Conditional Neural Process (ConvCNP), a new member of the Neural Process family that models translation equivariance in the data. Translation equivariance is an important inductive bias for many learning problems, including time series modelling, spatial data, and images. The model embeds data sets into an infinite-dimensional function space, as opposed to a finite-dimensional vector space. To formalize this notion, we extend the theory of neural representations of sets to include functional representations, and demonstrate that any translation-equivariant embedding can be represented using a convolutional deep set. We evaluate ConvCNPs in several settings, demonstrating that they achieve state-of-the-art performance compared to existing NPs. We demonstrate that building in translation equivariance enables zero-shot generalization to challenging, out-of-domain tasks.
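The functional embedding described in the abstract can be illustrated with a minimal sketch. The function name and parameters below are hypothetical, not from the paper's code: a context set {(x_i, y_i)} is mapped to a function evaluated on a grid, with one "density" channel counting where context points lie and one kernel-smoothed signal channel, in the spirit of the convolutional deep set construction. Because the kernel depends only on differences t - x_i, shifting the context set and the grid together leaves the embedding unchanged, which is the translation-equivariance property the model builds in.

```python
import numpy as np

def set_conv_encode(x_ctx, y_ctx, t_grid, length_scale=0.1):
    """Embed a context set {(x_i, y_i)} as a function on a grid.

    Hypothetical sketch: channel 0 ("density") counts where context
    points fall; channel 1 carries the kernel-smoothed, density-
    normalized signal.
    """
    # Squared distances between every grid point and every context input.
    d2 = (t_grid[:, None] - x_ctx[None, :]) ** 2
    k = np.exp(-0.5 * d2 / length_scale**2)       # RBF kernel weights
    density = k.sum(axis=1)                       # channel 0: density
    signal = (k * y_ctx[None, :]).sum(axis=1)     # channel 1: signal
    return np.stack([density, signal / np.maximum(density, 1e-8)], axis=1)

# Toy 1-D context set and evaluation grid.
x_ctx = np.array([0.2, 0.5, 0.8])
y_ctx = np.array([1.0, -1.0, 0.5])
t_grid = np.linspace(0.0, 1.0, 64)
emb = set_conv_encode(x_ctx, y_ctx, t_grid)       # shape (64, 2)

# Translating context and grid by the same amount leaves the
# functional embedding unchanged.
emb_shift = set_conv_encode(x_ctx + 0.3, y_ctx, t_grid + 0.3)
assert np.allclose(emb, emb_shift)
```

In the full model, this gridded representation would then be processed by a CNN (itself translation equivariant) before decoding at target locations; the sketch only covers the set-to-function embedding step.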

