
Stochastic Variance-Reduced Cubic Regularized Newton Methods

2020-03-16

Abstract

We propose a stochastic variance-reduced cubic regularized Newton method (SVRC) for non-convex optimization. At the core of our algorithm is a novel semi-stochastic gradient, along with a semi-stochastic Hessian, specifically designed for the cubic regularization method. We show that our algorithm is guaranteed to converge to an $(\epsilon, \sqrt{\epsilon})$-approximate local minimum within $\tilde O(n^{4/5}/\epsilon^{3/2})$ second-order oracle calls, which outperforms state-of-the-art cubic regularization algorithms, including subsampled cubic regularization. Our work also sheds light on applying variance reduction techniques to high-order non-convex optimization methods. Thorough experiments on various non-convex optimization problems support our theory.
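The abstract describes the two ingredients of SVRC: semi-stochastic gradient and Hessian estimators (a minibatch difference anchored at a snapshot point) plugged into a cubic-regularized Newton step. The following is a minimal sketch of one such epoch on a toy least-squares problem; the batch sizes, the penalty parameter `M`, and the gradient-descent cubic-subproblem solver are illustrative assumptions, not the paper's exact choices.

```python
# Hedged sketch of an SVRC-style epoch. Assumptions: toy least-squares
# objective, fixed batch size, and an approximate cubic-subproblem solver.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
y = rng.standard_normal(n)

def grad(x, idx):            # minibatch gradient of (1/2)|A x - y|^2 / |idx|
    r = A[idx] @ x - y[idx]
    return A[idx].T @ r / len(idx)

def hess(x, idx):            # minibatch Hessian (constant for least squares)
    return A[idx].T @ A[idx] / len(idx)

def solve_cubic(g, H, M, iters=200, eta=0.01):
    # Approximately minimize m(h) = g^T h + h^T H h / 2 + (M/6) ||h||^3
    # by gradient descent; note d/dh (M/6)||h||^3 = (M/2) ||h|| h.
    h = np.zeros_like(g)
    for _ in range(iters):
        h -= eta * (g + H @ h + 0.5 * M * np.linalg.norm(h) * h)
    return h

def svrc_epoch(x_tilde, inner_T=10, batch=32, M=10.0):
    g_full = grad(x_tilde, np.arange(n))   # snapshot (full) gradient
    H_full = hess(x_tilde, np.arange(n))   # snapshot (full) Hessian
    x = x_tilde.copy()
    for _ in range(inner_T):
        idx = rng.choice(n, batch, replace=False)
        # semi-stochastic estimators: minibatch difference + snapshot term
        v = grad(x, idx) - grad(x_tilde, idx) + g_full
        U = hess(x, idx) - hess(x_tilde, idx) + H_full
        x = x + solve_cubic(v, U, M)       # cubic-regularized Newton step
    return x

x0 = rng.standard_normal(d)
x = x0.copy()
for _ in range(5):                          # a few outer epochs
    x = svrc_epoch(x)
```

Because the correction terms vanish as the inner iterate approaches the snapshot, the estimators' variance shrinks over an epoch, which is what lets the method retain a fast cubic-regularization rate at sub-full-gradient cost.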

