O(log T) Projections for Stochastic Optimization of Smooth and Strongly Convex Functions

Abstract

Traditional algorithms for stochastic optimization require projecting the solution at each iteration onto a given domain to ensure its feasibility. When facing complex domains, such as the positive semidefinite cone, the projection operation can be expensive, leading to a high computational cost per iteration. In this paper, we present a novel algorithm that aims to reduce the number of projections for stochastic optimization. The proposed algorithm combines the strengths of several recent developments in stochastic optimization, including mini-batches, extragradient steps, and epoch gradient descent, in order to effectively exploit the smoothness and strong convexity. We show, both in expectation and with high probability, that when the objective function is both smooth and strongly convex, the proposed algorithm achieves the optimal O(1/T) rate of convergence with only O(log T) projections. Our empirical study verifies the theoretical result.
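
To make the projection-saving idea concrete, below is a minimal Python sketch of a simplified epoch-style stochastic gradient method in which the iterate is projected only once per epoch; with epoch lengths that double, roughly T stochastic gradient steps fit into O(log T) epochs and hence O(log T) projections. The feasible set (an l2 ball standing in for an expensive domain such as the positive semidefinite cone), the step-size schedule, and the helper names project_l2_ball and epoch_sgd are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np


def project_l2_ball(x, radius=1.0):
    """Euclidean projection onto an l2 ball; a cheap stand-in for an expensive
    projection such as the one onto the positive semidefinite cone."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)


def epoch_sgd(grad_fn, x0, T, eta0=0.1, radius=1.0, rng=None):
    """Run roughly T stochastic gradient steps, projecting only at epoch ends.

    grad_fn(x, rng) must return an unbiased stochastic gradient at x.
    Returns the final (projected) iterate and the number of projections used.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    x = project_l2_ball(np.asarray(x0, dtype=float), radius)
    epoch_len, eta, used, projections = 1, eta0, 0, 0
    while used < T:
        steps = min(epoch_len, T - used)
        avg = np.zeros_like(x)
        for _ in range(steps):
            x = x - eta * grad_fn(x, rng)  # unconstrained steps inside the epoch
            avg += x
        x = project_l2_ball(avg / steps, radius)  # one projection per epoch
        projections += 1
        used += steps
        epoch_len *= 2  # doubling epoch lengths => O(log T) epochs in total
        eta /= 2        # shrink the step size between epochs
    return x, projections


if __name__ == "__main__":
    # Toy problem: minimize E[0.5 * ||x - z||^2], z ~ N(mu, I), over the unit ball.
    mu = np.array([2.0, -1.0])
    grad = lambda x, rng: x - (mu + rng.normal(size=x.shape))  # unbiased gradient
    x_hat, n_proj = epoch_sgd(grad, x0=np.zeros(2), T=10_000)
    print("solution:", x_hat, "projections:", n_proj)  # ~14 projections for T = 10^4
```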
