DICOD: Distributed Convolutional Coordinate Descent for Convolutional Sparse Coding

2020-03-19

Abstract

In this paper, we introduce DICOD, a convolutional sparse coding algorithm which builds shift invariant representations for long signals. This algorithm is designed to run in a distributed setting, with local message passing, making it communication efficient. It is based on coordinate descent and uses locally greedy updates which accelerate the resolution compared to greedy coordinate selection. We prove the convergence of this algorithm and highlight its computational speed-up which is super-linear in the number of cores used. We also provide empirical evidence for the acceleration properties of our algorithm compared to state-of-the-art methods.
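The abstract's core ingredients — coordinate descent with soft-thresholding updates and greedy coordinate selection for the convolutional sparse coding objective — can be sketched for a 1D signal as follows. This is a minimal single-process NumPy illustration, not the distributed DICOD algorithm (it uses globally greedy selection and recomputes correlations naively rather than exploiting local updates and message passing); all function and variable names here are our own:

```python
import numpy as np

def soft_threshold(b, lam):
    """Elementwise soft-thresholding operator ST(b, lam)."""
    return np.sign(b) * np.maximum(np.abs(b) - lam, 0.0)

def greedy_cd_csc(x, D, lam, n_iter=500, tol=1e-10):
    """Greedy coordinate descent for 1D convolutional sparse coding.

    Minimizes 0.5 * ||x - sum_k conv(z_k, d_k)||^2 + lam * ||z||_1
    over the sparse activations z, for a fixed dictionary D of K
    atoms of length L (illustrative sketch, not the DICOD solver).
    """
    K, L = D.shape
    T = len(x) - L + 1                 # valid shift positions per atom
    z = np.zeros((K, T))
    norms = (D ** 2).sum(axis=1)       # ||d_k||^2, per-atom step sizes
    r = x.copy()                       # residual x - sum_k conv(z_k, d_k)
    for _ in range(n_iter):
        # Candidate coordinate updates: z_kt -> ST(z_kt*||d_k||^2 + <r, d_k(.-t)>, lam)/||d_k||^2
        corr = np.array([np.correlate(r, D[k], mode='valid') for k in range(K)])
        z_new = soft_threshold(z * norms[:, None] + corr, lam) / norms[:, None]
        delta = z_new - z
        # Greedy selection: the coordinate whose update changes the most
        k0, t0 = np.unravel_index(np.argmax(np.abs(delta)), delta.shape)
        if abs(delta[k0, t0]) < tol:
            break                      # no coordinate moves: converged
        # Apply the single update and patch the residual locally
        r[t0:t0 + L] -= delta[k0, t0] * D[k0]
        z[k0, t0] = z_new[k0, t0]
    return z, r
```

Because each update touches only a length-`L` window of the residual, coordinate updates at positions more than `L` apart are independent — the locality that DICOD exploits to run workers on disjoint segments of a long signal with only local message passing.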
