
Optimal rates for first-order stochastic convex optimization under Tsybakov noise condition

2020-03-02

Abstract

We focus on the problem of minimizing a convex function f over a convex set S given T queries to a stochastic first-order oracle. We argue that the complexity of convex minimization is determined solely by the rate of growth of the function around its minimizer x*_{f,S}, as quantified by a Tsybakov-like noise condition. Specifically, we prove that if f grows at least as fast as ‖x − x*_{f,S}‖^k around its minimum, for some k > 1, then the optimal rate of learning f(x*_{f,S}) is Θ(T^{−k/(2k−2)}). The classic rates Θ(1/√T) for convex functions and Θ(1/T) for strongly convex functions are special cases of our result, for k → ∞ and k = 2 respectively, and even faster rates are attained for k < 2. We also derive tight bounds for the complexity of learning x*_{f,S}, where the optimal rate is Θ(T^{−1/(2k−2)}). Interestingly, these precise rates for convex optimization also characterize the complexity of active learning, and our results further strengthen the connections between the two fields, both of which rely on feedback-driven queries.
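
For concreteness, plugging the two classical regimes into the exponent k/(2k−2) from the abstract recovers the familiar rates (this is just arithmetic on the stated rate):

\[
\left.\frac{k}{2k-2}\right|_{k=2} = 1 \;\Rightarrow\; \Theta\!\left(T^{-1}\right)\ \text{(strongly convex)},
\qquad
\lim_{k\to\infty}\frac{k}{2k-2} = \frac{1}{2} \;\Rightarrow\; \Theta\!\left(T^{-1/2}\right)\ \text{(general convex)}.
\]

Likewise, for learning the minimizer x*_{f,S} itself, the exponent 1/(2k−2) at k = 2 gives Θ(T^{−1/2}); for 1 < k < 2 both exponents grow larger, which is the sense in which faster rates are attained for k < 2.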

