Smooth Primal-Dual Coordinate Descent Algorithms for Nonsmooth Convex Optimization

2020-02-10

Abstract

We propose a new randomized coordinate descent method for a convex optimization template with broad applications. Our analysis relies on a novel combination of four ideas applied to the primal-dual gap function: smoothing, acceleration, homotopy, and coordinate descent with non-uniform sampling. As a result, our method is the first coordinate descent method with convergence rate guarantees matching the best known rates under a variety of common structural assumptions on the template. We provide numerical evidence supporting the theoretical results, with a comparison to state-of-the-art algorithms.
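To make one of the four ingredients concrete, the following is a minimal sketch of randomized coordinate descent with non-uniform (Lipschitz-proportional) sampling on a smooth convex quadratic. It illustrates only the sampling idea named in the abstract, not the paper's full primal-dual smoothing/homotopy scheme; the function name and problem instance are illustrative assumptions.

```python
import numpy as np

def coord_descent(A, b, iters=2000, seed=0):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    by randomized coordinate descent with non-uniform sampling."""
    rng = np.random.default_rng(seed)
    n = b.size
    L = np.diag(A).copy()        # coordinate-wise Lipschitz constants of grad f
    p = L / L.sum()              # sample coordinate i with probability ∝ L_i
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(n, p=p)
        g_i = A[i] @ x - b[i]    # partial gradient ∇_i f(x)
        x[i] -= g_i / L[i]       # coordinate step with step size 1/L_i
    return x

# Usage: solve A x = b for a random well-conditioned SPD system.
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)
b = rng.standard_normal(5)
x = coord_descent(A, b)
```

Sampling coordinates in proportion to their Lipschitz constants, rather than uniformly, is what lets the analysis trade per-coordinate smoothness for a sharper overall rate.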
