We show that given an estimate Â that is close to a general high-rank positive semi-definite (PSD) matrix A in spectral norm (i.e., ‖Â − A‖₂ ≤ δ), the simple truncated Singular Value Decomposition of Â produces a multiplicative approximation of A in Frobenius norm. This observation leads to many interesting results on general high-rank matrix estimation problems:

1. High-rank matrix completion: we show that it is possible to recover a general high-rank matrix A up to (1 + ε) relative error in Frobenius norm from partial observations, with sample complexity independent of the spectral gap of A.

2. High-rank matrix denoising: we design an algorithm that recovers a matrix A with small error in Frobenius norm from its noise-perturbed observations, without assuming A is exactly low-rank.

3. Low-dimensional approximation of high-dimensional covariance: given N i.i.d. samples of dimension n drawn from a distribution with covariance matrix A, we show that it is possible to approximate A with relative error in Frobenius norm, with a sample complexity improving over that required by classical covariance estimation results.
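The core observation can be illustrated numerically. The following is a minimal sketch (not the paper's algorithm; the matrix sizes, rank k, spectrum, and noise level are all hypothetical choices): we build a high-rank PSD matrix A, perturb it to get an estimate Â with ‖Â − A‖₂ = δ, take the rank-k truncated SVD of Â, and compare its Frobenius error against the best rank-k approximation of A itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 10  # hypothetical dimension and truncation rank

# High-rank PSD matrix A: k dominant eigenvalues plus a long small tail.
U = np.linalg.qr(rng.standard_normal((n, n)))[0]
spectrum = np.concatenate([np.linspace(10.0, 5.0, k), 0.1 * rng.random(n - k)])
A = (U * spectrum) @ U.T

# Noisy estimate A_hat = A + E with a small symmetric perturbation E;
# delta is the spectral-norm distance between A_hat and A.
E = rng.standard_normal((n, n))
E = 0.01 * (E + E.T) / 2.0
A_hat = A + E
delta = np.linalg.norm(E, 2)

# Rank-k truncated SVD of the *estimate* A_hat.
Uh, sh, Vh = np.linalg.svd(A_hat)
A_hat_k = (Uh[:, :k] * sh[:k]) @ Vh[:k, :]

# Best rank-k approximation of A itself (Eckart-Young baseline).
Ua, sa, Va = np.linalg.svd(A)
A_best_k = (Ua[:, :k] * sa[:k]) @ Va[:k, :]

err_trunc = np.linalg.norm(A - A_hat_k, "fro")
err_best = np.linalg.norm(A - A_best_k, "fro")
print(f"delta = {delta:.3f}, trunc-SVD error = {err_trunc:.3f}, "
      f"best rank-k error = {err_best:.3f}")
```

For small δ the truncated SVD of the noisy estimate attains a Frobenius error close to the unavoidable best-rank-k error of A, even though A itself has full rank.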