Abstract
We propose a simple, scalable, and fast gradient descent algorithm to optimize a nonconvex objective for the rank minimization problem and a closely related family of semidefinite programs. With random measurements of a positive semidefinite n×n matrix of rank r and condition number κ, our method is guaranteed to converge linearly to the global optimum.
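To make the setting concrete, the following is a minimal illustrative sketch (not the paper's actual algorithm or parameters) of gradient descent on a nonconvex factored objective for low-rank matrix sensing: we recover a PSD matrix M* = U*U*ᵀ of rank r from random linear measurements yᵢ = ⟨Aᵢ, M*⟩ by minimizing f(U) = (1/4m) Σᵢ (⟨Aᵢ, UUᵀ⟩ − yᵢ)² over the n×r factor U. All dimensions, step sizes, and the near-truth initialization below are assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, m = 20, 2, 300            # dimension, rank, number of measurements (illustrative)

# Ground-truth PSD matrix M* = U* U*^T of rank r.
U_star = rng.standard_normal((n, r))
M_star = U_star @ U_star.T

# Random symmetric Gaussian sensing matrices A_i and measurements y_i = <A_i, M*>.
A = rng.standard_normal((m, n, n))
A = (A + A.transpose(0, 2, 1)) / 2
y = np.einsum('mij,ij->m', A, M_star)

def grad(U):
    """Gradient of f(U) = (1/4m) sum_i (<A_i, U U^T> - y_i)^2.

    Since each A_i is symmetric, df/dU = (1/m) sum_i residual_i * A_i U.
    """
    residual = np.einsum('mij,ij->m', A, U @ U.T) - y
    return np.einsum('m,mij->ij', residual, A) @ U / m

# Plain gradient descent from a point near the truth (the paper's guarantee
# concerns a principled initialization; here we simply perturb U* for brevity).
U = U_star + 0.1 * rng.standard_normal((n, r))
step = 0.01
for _ in range(1500):
    U -= step * grad(U)

err = np.linalg.norm(U @ U.T - M_star) / np.linalg.norm(M_star)
print(f"relative recovery error: {err:.2e}")
```

Note that the iterates live in the n×r factor space, so each step costs far less than projecting onto the PSD cone as a full-matrix semidefinite solver would; this is the source of the scalability claimed in the abstract.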