Abstract
We propose a stochastic variance-reduced cubic regularized Newton method (SVRC) for nonconvex optimization. At the core of our algorithm is a novel semi-stochastic gradient along with a semi-stochastic Hessian, which are specifically designed for the cubic regularization method. We show that our algorithm is guaranteed to converge to an $(\epsilon, \sqrt{\epsilon})$-approximate local minimum within $\widetilde{O}(n^{4/5}\epsilon^{-3/2})$ second-order oracle calls, which outperforms state-of-the-art cubic regularization algorithms, including subsampled cubic regularization. Our work also sheds light on the application of variance reduction techniques to high-order nonconvex optimization methods. Thorough experiments on various nonconvex optimization problems support our theory.
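For concreteness, the following is a minimal sketch of the estimator structure that variance-reduced second-order methods of this kind typically use for a finite-sum objective $F(x) = \frac{1}{n}\sum_{i=1}^{n} f_i(x)$; the snapshot point $\hat{x}$, batch-index sets $I_g, I_h$, and cubic penalty parameter $M$ below are illustrative assumptions, not the paper's precise construction or parameter choices:
\begin{align*}
v_t &= \frac{1}{|I_g|}\sum_{i \in I_g}\Big[\nabla f_i(x_t) - \nabla f_i(\hat{x}) - \nabla^2 f_i(\hat{x})(x_t - \hat{x})\Big] + \nabla F(\hat{x}) + \nabla^2 F(\hat{x})(x_t - \hat{x}), && \text{(semi-stochastic gradient)}\\
U_t &= \frac{1}{|I_h|}\sum_{j \in I_h}\Big[\nabla^2 f_j(x_t) - \nabla^2 f_j(\hat{x})\Big] + \nabla^2 F(\hat{x}), && \text{(semi-stochastic Hessian)}\\
h_t &= \operatorname*{argmin}_{h \in \mathbb{R}^d}\; \langle v_t, h\rangle + \tfrac{1}{2}\, h^\top U_t h + \tfrac{M}{6}\,\|h\|_2^3, \qquad x_{t+1} = x_t + h_t. && \text{(cubic subproblem)}
\end{align*}
Under this generic construction, both estimators are unbiased ($\mathbb{E}[v_t] = \nabla F(x_t)$ and $\mathbb{E}[U_t] = \nabla^2 F(x_t)$), and their variance shrinks as $x_t$ approaches the snapshot $\hat{x}$, which is the mechanism by which variance reduction lowers the sampling cost of each cubic regularization step.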