Nonconvex Relaxation Approaches to Robust Matrix Recovery
Shusen Wang, Dehua Liu, and Zhihua Zhang

Abstract

Motivated by recent developments of nonconvex penalties in sparsity modeling, we propose a nonconvex optimization model for handling the low-rank matrix recovery problem. Different from the well-known robust principal component analysis (RPCA), we suggest recovering the low-rank and sparse matrices via a nonconvex loss function and a nonconvex penalty. The advantage of the nonconvex approach lies in its stronger robustness. To solve the model, we devise a majorization-minimization augmented Lagrange multiplier (MM-ALM) algorithm which finds locally optimal solutions of the proposed nonconvex model. We also provide an efficient strategy to speed up MM-ALM, which makes its running time comparable with that of the state-of-the-art algorithm for solving RPCA. Finally, empirical results demonstrate the superiority of our nonconvex approach over RPCA in terms of matrix recovery accuracy.
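The abstract describes the method only at a high level. As a rough illustration of the general MM-ALM idea it mentions (not the authors' exact algorithm), the sketch below decomposes an observed matrix M into a low-rank part L and a sparse part S, replacing RPCA's convex nuclear-norm and l1 penalties with nonconvex log-sum surrogates handled by majorization-minimization, i.e. iteratively reweighted singular-value and entry-wise thresholding inside an augmented-Lagrangian loop. The function names (mm_alm_sketch, weighted_svt, soft_threshold), the log-sum surrogate, the reweighting rule, and all parameter choices (mu, lam, eps, iteration counts) are assumptions made for illustration only.

```python
import numpy as np

def soft_threshold(X, tau):
    # Entry-wise soft thresholding; tau may be a scalar or a matrix of weights.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def weighted_svt(X, tau_vec):
    # Singular-value thresholding with a per-singular-value threshold vector.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_shrunk = np.maximum(s - tau_vec[:len(s)], 0.0)
    return (U * s_shrunk) @ Vt

def mm_alm_sketch(M, lam=None, mu=None, eps=1e-2, outer=5, inner=50):
    # Illustrative MM + augmented-Lagrangian loop for M ~= L (low-rank) + S (sparse).
    # Outer loop (MM): majorize a log-sum penalty by reweighting, weight ~ 1/(value + eps).
    # Inner loop (ALM): solve the resulting weighted RPCA-style subproblem.
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = mu if mu is not None else 0.25 * m * n / (np.abs(M).sum() + 1e-12)
    L = np.zeros_like(M); S = np.zeros_like(M); Y = np.zeros_like(M)
    w_sv = np.ones(min(m, n))       # weights on the singular values of L
    w_sp = np.ones_like(M)          # weights on the entries of S
    for _ in range(outer):
        for _ in range(inner):
            L = weighted_svt(M - S + Y / mu, w_sv / mu)
            S = soft_threshold(M - L + Y / mu, lam * w_sp / mu)
            Y = Y + mu * (M - L - S)
        sv = np.linalg.svd(L, compute_uv=False)
        w_sv = 1.0 / (sv + eps)
        w_sp = 1.0 / (np.abs(S) + eps)
    return L, S

# Toy usage: a random low-rank matrix corrupted by sparse outliers.
rng = np.random.default_rng(0)
L0 = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 100))
S0 = np.zeros((100, 100))
mask = rng.random((100, 100)) < 0.05
S0[mask] = 10 * rng.standard_normal(mask.sum())
L_hat, S_hat = mm_alm_sketch(L0 + S0)
print("relative low-rank recovery error:", np.linalg.norm(L_hat - L0) / np.linalg.norm(L0))
```

The sketch keeps a fixed penalty parameter mu and a fixed number of inner iterations for simplicity; a practical implementation would add continuation on mu and convergence checks.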
