Kernelized Synaptic Weight Matrices

2020-03-19

Abstract

In this paper we introduce a novel neural network architecture in which weight matrices are reparametrized in terms of low-dimensional vectors, interacting through kernel functions. A layer of our network can be interpreted as introducing a (potentially infinitely wide) linear layer between input and output. We describe the theory underpinning this model and validate it with concrete examples, exploring how it can be used to impose structure on neural networks in diverse applications ranging from data visualization to recommender systems. We achieve state-of-the-art performance in a collaborative filtering task (MovieLens).
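The core idea of the abstract — reparametrizing a weight matrix through low-dimensional per-unit vectors whose pairwise kernel values form the synaptic weights — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the class and parameter names are hypothetical, and the RBF kernel is one arbitrary choice of kernel function.

```python
import numpy as np

def rbf_kernel(U, V, gamma=1.0):
    # Pairwise RBF kernel: K[i, j] = exp(-gamma * ||U[i] - V[j]||^2)
    sq_dists = (U**2).sum(1)[:, None] + (V**2).sum(1)[None, :] - 2.0 * U @ V.T
    return np.exp(-gamma * sq_dists)

class KernelizedLinear:
    """Linear layer whose weight matrix is W = k(U, V):
    each input unit i and each output unit j carries a low-dimensional
    embedding, and the synaptic weight W[i, j] is their kernel similarity.
    The trainable parameters are the embeddings U, V, not W itself."""
    def __init__(self, n_in, n_out, embed_dim=2, seed=0):
        rng = np.random.default_rng(seed)
        self.U = rng.normal(size=(n_in, embed_dim))   # input-unit embeddings
        self.V = rng.normal(size=(n_out, embed_dim))  # output-unit embeddings

    def forward(self, x):
        W = rbf_kernel(self.U, self.V)  # (n_in, n_out), rebuilt from embeddings
        return x @ W

layer = KernelizedLinear(n_in=4, n_out=3)
y = layer.forward(np.ones((2, 4)))
print(y.shape)  # (2, 3)
```

Note that the parameter count scales with `embed_dim * (n_in + n_out)` rather than `n_in * n_out`, which is what lets this reparametrization impose low-dimensional structure on the weights.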

