Parallel Coordinate Descent for L1-Regularized Loss Minimization


Abstract

We propose Shotgun, a parallel coordinate descent algorithm for minimizing L1-regularized losses. Though coordinate descent seems inherently sequential, we prove convergence bounds for Shotgun which predict linear speedups, up to a problem-dependent limit. We present a comprehensive empirical study of Shotgun for Lasso and sparse logistic regression. Our theoretical predictions on the potential for parallelism closely match behavior on real data. Shotgun outperforms other published solvers on a range of large problems, proving to be one of the most scalable algorithms for L1.
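
The abstract's claim rests on a simple mechanism: run ordinary Lasso coordinate (shooting) updates, but apply P of them simultaneously from the same iterate instead of one at a time. Below is a minimal NumPy sketch of that idea, not the authors' reference implementation; the names `shotgun_lasso` and `soft_threshold`, the parameter `P`, and the use of a vectorized batch update to simulate P parallel workers are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding: argmin_w 0.5*(w - z)**2 + t*|w|, elementwise."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def shotgun_lasso(A, y, lam, P=8, iters=1000, seed=0):
    """Shotgun-style parallel coordinate descent for
    0.5*||A x - y||^2 + lam*||x||_1 (assumes no all-zero columns of A).

    Each round picks P coordinates uniformly at random and updates them
    all from the same residual, mimicking P parallel workers."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    col_sq = (A ** 2).sum(axis=0)  # per-column squared norms
    resid = y.astype(float)        # residual y - A x (x starts at 0)
    for _ in range(iters):
        J = rng.choice(d, size=min(P, d), replace=False)
        # Each "worker" computes its shooting update from the shared residual.
        z = x[J] + (A[:, J].T @ resid) / col_sq[J]
        new_xj = soft_threshold(z, lam / col_sq[J])
        delta = new_xj - x[J]
        # Apply all P updates at once; tolerating these potentially
        # conflicting simultaneous writes is what Shotgun's theory bounds.
        x[J] = new_xj
        resid -= A[:, J] @ delta
    return x

# Tiny synthetic demo (illustrative only).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    x_true = np.zeros(50)
    x_true[:5] = 1.0
    y = A @ x_true + 0.01 * rng.standard_normal(200)
    x_hat = shotgun_lasso(A, y, lam=0.1, P=8)
```

The "problem-dependent limit" in the abstract is the paper's bound on how many simultaneous updates the problem tolerates: roughly P* = d / ρ, where d is the number of features and ρ is the spectral radius of AᵀA, so nearly uncorrelated features admit many parallel updates while highly correlated ones do not.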
