
Natasha 2: Faster Non-Convex Optimization Than SGD


Abstract 

We design a stochastic algorithm to find ε-approximate local minima of any smooth nonconvex function in rate O(ε^{-3.25}), with only oracle access to stochastic gradients. The best result before this work was O(ε^{-4}) by stochastic gradient descent (SGD).
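The abstract assumes only oracle access to stochastic gradients, i.e. the algorithm never sees the full gradient, only unbiased minibatch estimates of it. As a point of reference for the SGD baseline mentioned above, the sketch below runs plain SGD with such an oracle on a synthetic smooth nonconvex problem; the objective, data, step size, and step count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Synthetic smooth nonconvex objective (assumption, for illustration only):
# f(x) = (1/n) * sum_i [ 1 - 1 / (1 + (a_i . x - b_i)^2) ]
rng = np.random.default_rng(0)
A = rng.normal(size=(1000, 10))   # synthetic data matrix (hypothetical)
b = rng.normal(size=1000)         # synthetic targets (hypothetical)

def stochastic_gradient(x, batch_size=32):
    """Stochastic gradient oracle: unbiased estimate of grad f(x) from a random minibatch."""
    idx = rng.integers(0, A.shape[0], size=batch_size)
    r = A[idx] @ x - b[idx]              # per-sample residuals a_i . x - b_i
    w = 2 * r / (1 + r**2) ** 2          # derivative of the per-sample loss w.r.t. the residual
    return A[idx].T @ w / batch_size     # chain rule, averaged over the minibatch

def sgd(x0, lr=0.05, steps=5000):
    """Plain SGD baseline: x_{t+1} = x_t - lr * g_t, where g_t comes from the oracle."""
    x = x0.copy()
    for _ in range(steps):
        x -= lr * stochastic_gradient(x)
    return x

x_final = sgd(np.zeros(10))
print("final point:", x_final)
```

This baseline only drives the stochastic gradient toward zero; it makes no attempt to escape saddle points, which is the regime where the paper's O(ε^{-3.25}) rate improves on the O(ε^{-4}) SGD bound.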
