
Communication-Efficient Distributed Dual Coordinate Ascent


Abstract

Communication remains the most significant bottleneck in the performance of distributed optimization algorithms for large-scale machine learning. In this paper, we propose a communication-efficient framework, CoCoA, that uses local computation in a primal-dual setting to dramatically reduce the amount of necessary communication. We provide a strong convergence rate analysis for this class of algorithms, as well as experiments on real-world distributed datasets with implementations in Spark. In our experiments, we find that as compared to state-of-the-art mini-batch versions of SGD and SDCA algorithms, CoCoA converges to the same .001-accurate solution quality on average 25× as quickly.
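The framework the abstract describes alternates cheap local dual coordinate-ascent steps with infrequent communication rounds. Below is a minimal single-machine sketch of that outer loop for the squared-loss case, using the conservative "averaging" aggregation; the function names, the choice of loss, and all parameter values are illustrative assumptions, not taken from the paper or its released code.

```python
import numpy as np

def local_sdca(X, y, alpha, w, idx, lam, n, H, rng):
    """Run H dual coordinate-ascent steps on the local block `idx`,
    returning the accumulated (delta_alpha, delta_w). Illustrative sketch."""
    delta_alpha = np.zeros_like(alpha)
    delta_w = np.zeros_like(w)
    for _ in range(H):
        i = rng.choice(idx)
        x_i = X[i]
        # Closed-form SDCA step for squared loss (1/2)(x_i.w - y_i)^2;
        # later local steps see the earlier local progress via delta_*.
        resid = y[i] - x_i @ (w + delta_w) - (alpha[i] + delta_alpha[i])
        d = resid / (1.0 + x_i @ x_i / (lam * n))
        delta_alpha[i] += d
        delta_w += d * x_i / (lam * n)   # keeps w = X.T @ alpha / (lam*n)
    return delta_alpha, delta_w

def cocoa(X, y, lam=0.1, K=4, H=50, T=20, seed=0):
    """Simulate T communication rounds over K data partitions."""
    n, d = X.shape
    rng = np.random.default_rng(seed)
    alpha, w = np.zeros(n), np.zeros(d)
    blocks = np.array_split(rng.permutation(n), K)  # partition the data
    for _ in range(T):
        # Each "machine" works from the same shared w; only the
        # aggregated updates are communicated once per round.
        updates = [local_sdca(X, y, alpha, w, blk, lam, n, H, rng)
                   for blk in blocks]
        for d_alpha, d_w in updates:
            alpha += d_alpha / K  # averaging: scale each machine by 1/K
            w += d_w / K
    return w, alpha
```

Scaling every machine's contribution by 1/K is the safe choice; the paper also analyzes more aggressive aggregation of the local updates, trading robustness for faster per-round progress.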


