
An Asynchronous Parallel Stochastic Coordinate Descent Algorithm


Abstract

We describe an asynchronous parallel stochastic coordinate descent algorithm for minimizing smooth unconstrained or separably constrained functions. The method achieves a linear convergence rate on functions that satisfy an essential strong convexity property and a sublinear rate (1/K) on general convex functions. Near-linear speedup on a multicore system can be expected if the number of processors is O(n^{1/2}) in unconstrained optimization and O(n^{1/4}) in the separable-constrained case, where n is the number of variables. We describe results from implementation on 40-core processors.
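The abstract's core mechanism, worker threads that each pick a random coordinate, read the shared iterate without locking (so the read may be stale), and apply an atomic single-coordinate update, can be illustrated with a minimal C++ sketch. This is not the paper's implementation: the least-squares objective, the problem sizes, and the fixed step size `gamma` below are all illustrative assumptions.

```cpp
// Hedged sketch of asynchronous stochastic coordinate descent for the
// synthetic objective f(x) = 0.5 * ||A*x - b||^2. Illustrative only.
#include <atomic>
#include <cstdio>
#include <random>
#include <thread>
#include <vector>

int main() {
    const int n = 8, m = 32, threads = 4, iters_per_thread = 200000;

    // Synthetic least-squares data.
    std::vector<double> A(m * n), b(m);
    std::mt19937 gen(0);
    std::normal_distribution<double> nd(0.0, 1.0);
    for (auto &v : A) v = nd(gen);
    for (auto &v : b) v = nd(gen);

    // Shared iterate: each coordinate is an atomic double, so a
    // single-coordinate update is atomic, while a full read of x may see a
    // stale, "inconsistent" snapshot -- exactly the regime asynchronous
    // coordinate descent is designed to tolerate.
    std::vector<std::atomic<double>> x(n);
    for (auto &xi : x) xi.store(0.0);

    auto worker = [&](unsigned seed) {
        std::mt19937 g(seed);
        std::uniform_int_distribution<int> pick(0, n - 1);
        const double gamma = 1e-3;  // conservative fixed step (assumed)
        for (int t = 0; t < iters_per_thread; ++t) {
            int j = pick(g);
            // Partial gradient grad_j = A_j^T (A x - b), computed from a
            // possibly stale view of x, read with no locking.
            double grad = 0.0;
            for (int i = 0; i < m; ++i) {
                double r = -b[i];
                for (int k = 0; k < n; ++k)
                    r += A[i * n + k] * x[k].load(std::memory_order_relaxed);
                grad += A[i * n + j] * r;
            }
            // Atomic update of the chosen coordinate via compare-and-swap;
            // retries only resolve write conflicts on x[j].
            double old = x[j].load(std::memory_order_relaxed);
            while (!x[j].compare_exchange_weak(old, old - gamma * grad,
                                               std::memory_order_relaxed)) {}
        }
    };

    std::vector<std::thread> pool;
    for (int t = 0; t < threads; ++t) pool.emplace_back(worker, 1234u + t);
    for (auto &th : pool) th.join();

    // Report the final objective value.
    double f = 0.0;
    for (int i = 0; i < m; ++i) {
        double r = -b[i];
        for (int k = 0; k < n; ++k) r += A[i * n + k] * x[k].load();
        f += 0.5 * r * r;
    }
    std::printf("final f(x) = %.6f\n", f);
    return 0;
}
```

The lock-free design is the point: no thread ever waits on another, and the convergence analysis absorbs the resulting staleness, which is what permits the near-linear speedup regime (processor counts up to roughly O(n^{1/2}), or O(n^{1/4}) with separable constraints) claimed in the abstract.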


