
On the Power of Truncated SVD for General High-rank Matrix Estimation Problems


Abstract 

We show that given an estimate $\widehat{A}$ that is close to a general high-rank positive semi-definite (PSD) matrix $A$ in spectral norm (i.e., $\|\widehat{A} - A\|_2 \le \delta$), the simple truncated Singular Value Decomposition of $\widehat{A}$ produces a multiplicative approximation of $A$ in Frobenius norm. This observation leads to many interesting results on general high-rank matrix estimation problems:

1. High-rank matrix completion: we show that it is possible to recover a general high-rank matrix $A$ up to $(1 + \varepsilon)$ relative error in Frobenius norm from partial observations, with sample complexity independent of the spectral gap of $A$.

2. High-rank matrix denoising: we design an algorithm that recovers a matrix $A$ with small error in Frobenius norm from its noise-perturbed observations, without assuming $A$ is exactly low-rank.

3. Low-dimensional approximation of high-dimensional covariance: given $N$ i.i.d. samples of dimension $n$ from $\mathcal{N}_n(0, A)$, we show that it is possible to approximate the covariance matrix $A$ with relative error in Frobenius norm using $N \approx n$ samples, improving over classical covariance estimation results, which require $N \approx n^2$ samples.
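The following is a minimal NumPy sketch, not taken from the paper, that illustrates the core observation empirically: when $\widehat{A}$ is $\delta$-close to a high-rank PSD matrix $A$ in spectral norm, the rank-$k$ truncated SVD of $\widehat{A}$ is nearly as good a Frobenius-norm approximation of $A$ as the best rank-$k$ approximation of $A$ itself. All names, dimensions, and constants below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 500, 20

def truncated_svd(M, k):
    """Return the rank-k truncated SVD of M (its best rank-k approximation)."""
    U, s, Vt = np.linalg.svd(M)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

# A high-rank PSD matrix A: full rank, with a slowly decaying spectrum.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
spectrum = 1.0 / np.arange(1, n + 1)
A = (Q * spectrum) @ Q.T

# A_hat: a symmetric perturbation of A with ||A_hat - A||_2 = delta.
E = rng.standard_normal((n, n))
E = (E + E.T) / 2
delta = 0.05 * spectrum[k]          # perturbation level well below sigma_k(A)
A_hat = A + delta * E / np.linalg.norm(E, 2)

# Truncated SVD of the *estimate* vs. the best rank-k approximation of A itself.
A_hat_k = truncated_svd(A_hat, k)
best_k_err = np.linalg.norm(A - truncated_svd(A, k), "fro")
est_err = np.linalg.norm(A - A_hat_k, "fro")
print(f"||A - A_hat_k||_F / ||A - A_k||_F = {est_err / best_k_err:.3f}")
```

Under these assumptions the printed ratio stays close to 1, i.e., truncating the perturbed estimate loses only a multiplicative factor relative to the best rank-$k$ error, even though $A$ itself is full rank.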

