Third-order Smoothness Helps: Faster Stochastic Optimization Algorithms for Finding Local Minima

2020-02-14

Abstract 

We propose stochastic optimization algorithms that can find local minima faster than existing algorithms for nonconvex optimization problems, by exploiting third-order smoothness to escape non-degenerate saddle points more efficiently. More specifically, the proposed algorithm only needs $\tilde{O}(\epsilon^{-10/3})$ stochastic gradient evaluations to converge to an approximate local minimum $\mathbf{x}$, which satisfies $\|\nabla f(\mathbf{x})\|_2 \le \epsilon$ and $\lambda_{\min}(\nabla^2 f(\mathbf{x})) \ge -\sqrt{\epsilon}$ in unconstrained stochastic optimization, where $\tilde{O}(\cdot)$ hides logarithmic polynomial terms and constants. This improves upon the $\tilde{O}(\epsilon^{-7/2})$ gradient complexity achieved by the state-of-the-art stochastic local minima finding algorithms by a factor of $\tilde{O}(\epsilon^{-1/6})$. Experiments on two nonconvex optimization problems demonstrate the effectiveness of our algorithm and corroborate our theory.
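The approximate-local-minimum condition stated in the abstract (a small gradient norm together with a nearly positive semidefinite Hessian) can be checked numerically. The sketch below is illustrative only: the test function `f` and its derivatives are assumptions chosen to exhibit both a saddle point and a local minimum, not anything from the paper.

```python
import numpy as np

# Illustrative test function (an assumption, not from the paper):
#   f(x) = 0.25*x1**4 - 0.5*x1**2 + 0.5*x2**2
# It has a non-degenerate saddle at the origin and local minima at (+-1, 0).

def f_grad(x):
    # Analytic gradient of f.
    return np.array([x[0]**3 - x[0], x[1]])

def f_hess(x):
    # Analytic Hessian of f.
    return np.array([[3.0 * x[0]**2 - 1.0, 0.0],
                     [0.0, 1.0]])

def is_approx_local_min(x, eps):
    """Check ||grad f(x)||_2 <= eps and lambda_min(hess f(x)) >= -sqrt(eps)."""
    grad_ok = np.linalg.norm(f_grad(x)) <= eps
    hess_ok = np.linalg.eigvalsh(f_hess(x)).min() >= -np.sqrt(eps)
    return grad_ok and hess_ok

eps = 1e-3
saddle = np.array([0.0, 0.0])    # first-order stationary, but a saddle
minimum = np.array([1.0, 0.0])   # a genuine local minimum

print(is_approx_local_min(saddle, eps))   # gradient is 0, but lambda_min = -1 < -sqrt(eps)
print(is_approx_local_min(minimum, eps))  # both conditions hold
```

The saddle point passes the first-order test but fails the second-order one, which is exactly the kind of point the proposed algorithm is designed to escape.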

