A Superlinearly-Convergent Proximal Newton-Type Method for the Optimization of Finite Sums

2020-03-06

Abstract

We consider the problem of optimizing the strongly convex sum of a finite number of convex functions. Standard algorithms for solving this problem in the class of incremental/stochastic methods have at most a linear convergence rate. We propose a new incremental method whose convergence rate is superlinear: the Newton-type incremental method (NIM). The idea of the method is to introduce a model of the objective with the same sum-of-functions structure and to update a single component of the model per iteration. We prove that NIM has a superlinear local convergence rate and a linear global convergence rate. Experiments show that the method is very effective for problems with a large number of functions and a small number of variables.
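The core idea — keep one quadratic model per component, anchored at the point where that component was last evaluated, and refresh a single component per iteration — can be sketched in NumPy. The toy regularized logistic objective, the cyclic update order, and all variable names below are illustrative choices, not taken from the paper; the paper's full method also handles a proximal/composite term, which this unconstrained sketch omits.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, lam = 20, 5, 0.1

# Toy strongly convex finite sum:
#   f_i(x) = log(1 + exp(-b_i <a_i, x>)) + (lam/2) ||x||^2
A = rng.standard_normal((n, d))
b = rng.choice([-1.0, 1.0], size=n)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def grad(i, x):
    t = b[i] * (A[i] @ x)
    return -b[i] * sigmoid(-t) * A[i] + lam * x

def hess(i, x):
    t = b[i] * (A[i] @ x)
    s = sigmoid(t) * sigmoid(-t)
    return s * np.outer(A[i], A[i]) + lam * np.eye(d)

# Each component keeps a quadratic model anchored at its own point z_i;
# the next iterate minimizes the sum of all n models.
x = np.zeros(d)
z = [x.copy() for _ in range(n)]
g = [grad(i, x) for i in range(n)]
H = [hess(i, x) for i in range(n)]

# Running normal equations of the aggregate model:
#   (sum_i H_i) x = sum_i (H_i z_i - g_i)
H_sum = sum(H)
u_sum = sum(H[i] @ z[i] - g[i] for i in range(n))

for it in range(50 * n):                 # cyclic passes over components
    x = np.linalg.solve(H_sum, u_sum)    # minimizer of the aggregate model
    i = it % n
    # refresh only component i's model at the new iterate
    H_sum -= H[i]
    u_sum -= H[i] @ z[i] - g[i]
    z[i], g[i], H[i] = x.copy(), grad(i, x), hess(i, x)
    H_sum += H[i]
    u_sum += H[i] @ z[i] - g[i]

full_grad = sum(grad(i, x) for i in range(n))
print(np.linalg.norm(full_grad))
```

Because only one component's gradient and Hessian are recomputed per iteration, the per-iteration cost matches incremental first-order methods up to the linear solve, which is cheap in the small-dimension regime the abstract highlights.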

