Abstract
We study parameter estimation for sparse nonlinear regression. More specifically, we assume the data are given by y = f(x⊤β*) + ε, where f is nonlinear. To recover β*, we propose an ℓ1-regularized least-squares estimator. Unlike classical linear regression, the corresponding optimization problem is nonconvex because of the nonlinearity of f. In spite of the nonconvexity, we prove that under mild conditions, every stationary point of the objective enjoys an optimal statistical rate of convergence. Detailed numerical results are provided to support our theory.