Stochastic Smoothing for Nonsmooth Minimizations: Accelerating SGD by Exploiting Structure

Abstract

In this work we consider the stochastic minimization of nonsmooth convex loss functions, a central problem in machine learning. We propose a novel algorithm called Accelerated Nonsmooth Stochastic Gradient Descent (ANSGD), which exploits the structure of common nonsmooth loss functions to achieve optimal convergence rates for a class of problems including SVMs. It is the first stochastic algorithm that can achieve the optimal O(1/t) rate for minimizing nonsmooth loss functions (with strong convexity). The fast rates are confirmed by empirical comparisons, in which ANSGD significantly outperforms previous subgradient descent algorithms including SGD.
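
The abstract states the rate but not the mechanism; the structure it alludes to is that common nonsmooth losses such as the SVM hinge loss admit smooth surrogates (e.g. Nesterov-style smoothing), so a stochastic method can use true gradients of the surrogate instead of subgradients. The sketch below is only a toy illustration of that smoothing idea — a Nesterov-smoothed hinge loss inside an ordinary SGD loop with a heuristic decaying smoothing parameter — and not the paper's ANSGD update; the step-size and smoothing schedules here are assumptions chosen for illustration.

```python
import numpy as np

def smoothed_hinge_grad(w, x, y, mu):
    """Gradient of a Nesterov-smoothed hinge loss at a single example (x, y).

    The hinge loss max(0, 1 - y * <w, x>) is replaced by the smooth surrogate
    max over alpha in [0, 1] of (alpha * margin - mu * alpha**2 / 2);
    the smaller mu is, the closer the surrogate tracks the nonsmooth loss.
    """
    margin = 1.0 - y * np.dot(w, x)
    # The maximizing alpha is the projection of margin/mu onto [0, 1];
    # it plays the role of the (sub)gradient weight of max(0, .).
    alpha = np.clip(margin / mu, 0.0, 1.0)
    return -alpha * y * x

def sgd_on_smoothed_hinge(X, y, lam=1e-2, T=10_000, seed=0):
    """Plain SGD on an l2-regularized, smoothed hinge loss.

    Illustrative only: the 1/(lam * t) step size and the shrinking
    smoothing schedule mu_t are simple heuristics, not the accelerated
    scheme analyzed in the paper.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for t in range(1, T + 1):
        i = rng.integers(n)
        mu_t = 1.0 / np.sqrt(t)      # shrink the smoothing over time
        eta_t = 1.0 / (lam * t)      # standard step size under strong convexity
        g = smoothed_hinge_grad(w, X[i], y[i], mu_t) + lam * w
        w -= eta_t * g
    return w
```

For a fixed mu the surrogate has a Lipschitz-continuous gradient (with constant proportional to 1/mu), which is what allows smooth-optimization analysis to apply; shrinking mu_t over the run brings the surrogate back toward the original hinge loss. This trade-off between smoothness and approximation error is the kind of structural leverage the abstract refers to.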
