GRADIENTLESS DESCENT: HIGH-DIMENSIONAL ZEROTH-ORDER OPTIMIZATION

2020-01-02

Abstract

Zeroth-order optimization is the process of minimizing an objective f(x), given oracle access to evaluations at adaptively chosen inputs x. In this paper, we present two simple yet powerful GradientLess Descent (GLD) algorithms that do not rely on an underlying gradient estimate and are numerically stable. We analyze our algorithm from a novel geometric perspective and present a novel analysis that shows convergence within an ε-ball of the optimum in O(kQ log(n) log(R/ε)) evaluations, for any monotone transform of a smooth and strongly convex objective with latent dimension k < n, where the input dimension is n, R is the diameter of the input space, and Q is the condition number. Our rates are the first of their kind to be both 1) poly-logarithmically dependent on dimensionality and 2) invariant under monotone transformations. We further leverage our geometric perspective to show that our analysis is optimal. Both monotone invariance and the ability to exploit a low latent dimensionality are key to the empirical success of our algorithms, as demonstrated on BBOB and MuJoCo benchmarks.
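The core idea in the abstract can be sketched in a few lines: at each step, sample candidate points from balls of geometrically shrinking radii around the current iterate and move to the best evaluation. Because only comparisons of f are used, the procedure is invariant under monotone transforms. The sketch below is an illustrative reading of this description, not the paper's exact pseudocode; the parameter names (`R`, `r_min`, `T`) and loop structure are assumptions.

```python
import math
import random

def gld_search(f, x0, R, r_min, T, seed=0):
    """Sketch of GradientLess Descent with a search over sampling radii.

    f: objective taking a list of floats; x0: starting point;
    R: diameter of the search region; r_min: smallest radius tried;
    T: number of iterations. Illustrative, not the paper's exact algorithm.
    """
    rng = random.Random(seed)
    n = len(x0)
    x = list(x0)
    # Number of radius levels: r_k = R / 2^k for k = 0 .. K.
    K = int(math.ceil(math.log2(R / r_min)))
    for _ in range(T):
        best_x, best_val = x, f(x)
        for k in range(K + 1):
            r = R / 2 ** k
            # Uniform sample from the n-ball of radius r:
            # Gaussian direction, radius scaled by u^(1/n).
            g = [rng.gauss(0.0, 1.0) for _ in range(n)]
            norm = math.sqrt(sum(gi * gi for gi in g))
            scale = r * rng.random() ** (1.0 / n) / norm
            cand = [xi + scale * gi for xi, gi in zip(x, g)]
            val = f(cand)
            # Only comparisons of f are used, hence monotone invariance.
            if val < best_val:
                best_x, best_val = cand, val
        x = best_x
    return x

# Usage: minimize a simple 5-dimensional quadratic.
x_min = gld_search(lambda z: sum(zi * zi for zi in z),
                   x0=[1.0] * 5, R=2.0, r_min=1e-4, T=200)
```

Note that the per-iteration cost is O(log(R/r_min)) evaluations, matching the log(R/ε) factor in the stated rate.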

