Accelerated First-order Methods for Geodesically Convex Optimization on Riemannian Manifolds

Abstract

In this paper, we propose an accelerated first-order method for geodesically convex optimization, which generalizes Nesterov's standard accelerated method from Euclidean space to nonlinear Riemannian space. We first derive two equations and obtain two nonlinear operators for geodesically convex optimization, which replace the linear extrapolation step used in Euclidean space. In particular, we analyze the global convergence properties of our accelerated method for geodesically strongly convex problems, showing that it improves the convergence rate from $O\big((1-\mu/L)^k\big)$ to $O\big((1-\sqrt{\mu/L})^k\big)$. Moreover, our method also improves the global convergence rate on geodesically general convex problems from $O(1/k)$ to $O(1/k^2)$. Finally, we give a specific iterative scheme for matrix Karcher mean problems and validate our theoretical results with experiments.
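As a concrete instance of the setting above, here is a minimal Python sketch of an accelerated Riemannian gradient scheme for the matrix Karcher mean on the manifold of symmetric positive-definite (SPD) matrices. The exponential and logarithm maps follow the standard affine-invariant metric; the momentum step `beta * spd_log(x, x_prev)` is a common geodesic extrapolation heuristic, not the paper's two nonlinear operators, and the step sizes `eta`, `beta` and all function names are illustrative assumptions.

```python
import numpy as np

def _sym_fun(X, fun):
    """Apply a scalar function to the eigenvalues of a symmetric matrix."""
    w, Q = np.linalg.eigh(X)
    return (Q * fun(w)) @ Q.T

def spd_exp(X, V):
    """Exponential map at X on the SPD manifold (affine-invariant metric)."""
    Xh, Xih = _sym_fun(X, np.sqrt), _sym_fun(X, lambda w: 1.0 / np.sqrt(w))
    inner = Xih @ V @ Xih
    inner = (inner + inner.T) / 2  # guard against numerical asymmetry
    return Xh @ _sym_fun(inner, np.exp) @ Xh

def spd_log(X, Y):
    """Logarithm map at X on the SPD manifold (affine-invariant metric)."""
    Xh, Xih = _sym_fun(X, np.sqrt), _sym_fun(X, lambda w: 1.0 / np.sqrt(w))
    inner = Xih @ Y @ Xih
    inner = (inner + inner.T) / 2
    return Xh @ _sym_fun(inner, np.log) @ Xh

def karcher_mean_accelerated(As, steps=100, eta=0.5, beta=0.3):
    """Accelerated Riemannian gradient sketch for the matrix Karcher mean.

    The extrapolation uses a geodesic momentum heuristic, NOT the paper's
    exact nonlinear operators; eta and beta are illustrative values.
    """
    x_prev = x = sum(As) / len(As)  # arithmetic mean as a warm start
    for _ in range(steps):
        # Geodesic extrapolation: move past x, away from the previous iterate.
        y = spd_exp(x, -beta * spd_log(x, x_prev))
        # Riemannian gradient of f(Y) = (1/n) * sum_i d^2(Y, A_i).
        grad = -(2.0 / len(As)) * sum(spd_log(y, A) for A in As)
        x_prev, x = x, spd_exp(y, -eta * grad)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    As = [(lambda B: B @ B.T + 4 * np.eye(4))(rng.standard_normal((4, 4)))
          for _ in range(5)]
    print(np.round(karcher_mean_accelerated(As), 3))
```

Eigendecomposition-based matrix functions are used instead of scipy's `expm`/`logm` so the maps stay exactly symmetric; with `beta = 0` and `eta = 0.5` the loop reduces to the classical fixed-point iteration for the Karcher mean.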
