
Riemannian SVRG: Fast Stochastic Optimization on Riemannian Manifolds

2020-02-05

Abstract 

We study optimization of finite sums of geodesically smooth functions on Riemannian manifolds. Although variance reduction techniques for optimizing finite sums have received tremendous attention in recent years, existing work is limited to vector space problems. We introduce Riemannian SVRG (RSVRG), a new variance-reduced Riemannian optimization method. We analyze RSVRG for both geodesically convex and nonconvex (smooth) functions. Our analysis reveals that RSVRG inherits the advantages of the usual SVRG method, but with factors depending on the curvature of the manifold that influence its convergence. To our knowledge, RSVRG is the first provably fast stochastic Riemannian method. Moreover, our paper presents the first non-asymptotic complexity analysis (novel even for the batch setting) for nonconvex Riemannian optimization. Our results have several implications; for instance, they offer a Riemannian perspective on variance reduced PCA, which promises a short, transparent convergence analysis.
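The abstract describes an SVRG-style update carried out on a manifold: stochastic Riemannian gradients at the current iterate are combined with snapshot gradients transported into the current tangent space, and the step is taken via the exponential map. A minimal, hedged sketch of this idea on the unit sphere is below, applied to a finite-sum eigenvector problem (echoing the variance-reduced PCA connection the abstract mentions). The function names and the problem instance are illustrative, not from the paper; projection-based vector transport is used as a common stand-in for the exact parallel transport assumed in the paper's analysis.

```python
import numpy as np

# Illustrative sketch of an RSVRG-style loop on the unit sphere S^{d-1},
# minimizing F(x) = (1/n) * sum_i f_i(x) with f_i(x) = -(a_i^T x)^2,
# whose minimizer is the top eigenvector of A^T A / n.
rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))

def rgrad(x, idx):
    # Riemannian gradient on the sphere: project the Euclidean
    # gradient of the averaged f_i onto the tangent space at x.
    Ai = A[idx]
    g = -2.0 * Ai.T @ (Ai @ x) / len(idx)
    return g - (g @ x) * x

def expmap(x, v):
    # Exponential map on the sphere for a tangent vector v at x.
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def transport(x, v):
    # Vector transport by projection onto the tangent space at x
    # (a surrogate for the parallel transport used in the analysis).
    return v - (v @ x) * x

x = rng.standard_normal(d)
x /= np.linalg.norm(x)
eta, epochs, m = 0.01, 30, n  # step size and loop lengths (assumed)
full = np.arange(n)
for _ in range(epochs):
    x_snap = x.copy()
    g_snap = rgrad(x_snap, full)          # full gradient at the snapshot
    for _ in range(m):
        i = [rng.integers(n)]
        # Variance-reduced direction: stochastic gradient at x, minus the
        # snapshot's stochastic gradient and plus its full gradient, both
        # transported into the tangent space at x.
        v = rgrad(x, i) - transport(x, rgrad(x_snap, i)) + transport(x, g_snap)
        x = expmap(x, -eta * v)

# x should approximately align with the top eigenvector of A^T A / n.
top = np.linalg.eigh(A.T @ A)[1][:, -1]
align = abs(x @ top)
```

Because the exponential map returns exact unit vectors, the iterate stays on the manifold throughout; in a vector space, `expmap` and `transport` reduce to addition and the identity, recovering ordinary SVRG.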
