A Proximal-Gradient Homotopy Method for the ℓ1-Regularized Least-Squares Problem


2020-02-28

Abstract

We consider the ℓ1-regularized least-squares problem for sparse recovery and compressed sensing. Since the objective function is not strongly convex, standard proximal gradient methods only achieve sublinear convergence. We propose a homotopy continuation strategy, which employs a proximal gradient method to solve the problem with a sequence of decreasing regularization parameters. It is shown that under common assumptions in compressed sensing, the proposed method ensures that all iterates along the homotopy solution path are sparse, and the objective function is effectively strongly convex along the solution path. This observation allows us to obtain a global geometric convergence rate for the procedure. Empirical results are presented to support our theoretical analysis.
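The abstract's strategy can be sketched in a few lines: run proximal-gradient (ISTA-style) iterations for a fixed regularization parameter, then shrink the parameter geometrically and warm-start the next stage from the previous solution. The sketch below is a minimal illustration of this homotopy idea, not the paper's exact algorithm; the decrease factor `eta`, the stopping tolerance, and the iteration caps are assumed placeholder values.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1: elementwise shrinkage toward zero.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_grad(A, b, lam, x0, step, tol=1e-9, max_iter=1000):
    # Proximal-gradient iterations for 0.5 * ||Ax - b||^2 + lam * ||x||_1.
    x = x0.copy()
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)
        x_new = soft_threshold(x - step * grad, step * lam)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

def homotopy_lasso(A, b, lam_target, eta=0.7):
    # Homotopy continuation: solve a sequence of problems with
    # geometrically decreasing lambda, warm-starting each stage.
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1/L with L = ||A||_2^2
    lam = np.max(np.abs(A.T @ b))               # at this lambda, x = 0 is optimal
    x = np.zeros(A.shape[1])
    while lam > lam_target:
        lam = max(eta * lam, lam_target)
        x = prox_grad(A, b, lam, x, step)
    return x
```

Warm-starting keeps each stage's iterates near a sparse solution, which is exactly why the abstract's analysis can invoke effective strong convexity along the path and obtain a geometric overall rate.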

