We present a novel, simple, and systematic convergence analysis of gradient descent for eigenvector computation. As a popular, practical, and provable approach to numerous machine learning problems, gradient descent has found successful applications to eigenvector computation as well. However, surprisingly, it lacks a thorough theoretical analysis for the underlying geodesically non-convex problem. In this work, the convergence of the gradient descent solver for the leading eigenvector computation is shown to be at a global rate $O\!\left(\min\left\{\left(\frac{\lambda_1}{\Delta_p}\right)^{2}\log\frac{1}{\epsilon},\ \frac{1}{\epsilon}\right\}\right)$, where $\Delta_p = \lambda_p - \lambda_{p+1} > 0$ represents the generalized positive eigengap and always exists without loss of generality, with $\lambda_i$ being the $i$-th largest eigenvalue of the given real symmetric matrix and $p$ being the multiplicity of $\lambda_1$. We also show that the convergence depends only logarithmically, instead of quadratically, on the initial iterate. Particularly, this is the first time that the linear convergence for the case where the conventionally
considered eigengap $\Delta_1 = \lambda_1 - \lambda_2$ vanishes while the generalized eigengap $\Delta_p$ remains positive, as well as the logarithmic dependence on the initial iterate, are established for the gradient descent solver. We are also the first to leverage for the analysis the log principal angle between the iterate and the space of globally optimal solutions. Theoretical properties are verified in experiments.
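To make the setting concrete, the following minimal NumPy sketch (our own illustration, not the paper's exact algorithm, step size, or constants) runs Riemannian gradient ascent on the Rayleigh quotient over the unit sphere for a symmetric matrix whose leading eigenvalue has multiplicity $p = 2$, so that $\Delta_1 = 0$ but $\Delta_p > 0$, and tracks the tangent of the principal angle between the iterate and the leading eigenspace. The test eigenvalues, step size, and iteration count are arbitrary choices for demonstration.

```python
# A minimal sketch, assuming a hand-built test matrix: gradient ascent on the
# Rayleigh quotient x^T A x over the unit sphere, tracking the principal angle
# between the iterate and the space of leading eigenvectors.
import numpy as np

rng = np.random.default_rng(0)

# Symmetric matrix with a repeated leading eigenvalue (multiplicity p = 2):
# the conventional eigengap lambda_1 - lambda_2 is 0, but the generalized
# eigengap Delta_p = lambda_p - lambda_{p+1} = 1.0 is positive.
evals = np.array([2.0, 2.0, 1.0, 0.5, 0.1])
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))
A = Q @ np.diag(evals) @ Q.T

U = Q[:, :2]                  # orthonormal basis of the leading eigenspace
x = rng.standard_normal(5)
x /= np.linalg.norm(x)        # random initial iterate on the sphere

eta = 0.25 / evals[0]         # step size on the order of 1/lambda_1 (a choice)
for t in range(201):
    grad = A @ x - (x @ A @ x) * x   # Riemannian gradient of x^T A x on the sphere
    x = x + eta * grad
    x /= np.linalg.norm(x)           # retract back to the unit sphere
    cos_theta = np.linalg.norm(U.T @ x)
    tan_theta = np.sqrt(max(1.0 - cos_theta**2, 0.0)) / cos_theta
    if t % 40 == 0:
        print(f"iter {t:3d}  tan(theta) = {tan_theta:.3e}")
```

Under this setup, the printed $\tan\theta$ values shrink geometrically despite $\Delta_1 = 0$, illustrating the kind of linear convergence governed by the generalized eigengap that the analysis above establishes.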