We study the least squares regression problem $\min_{\Theta} \|A(\Theta) - b\|_2$, where $\Theta$ is a low-rank tensor, defined as $\Theta = \sum_{r=1}^{R} \theta_r^{(1)} \circ \cdots \circ \theta_r^{(D)}$, for vectors $\theta_r^{(d)} \in \mathbb{R}^{p_d}$. Here, $\circ$ denotes the outer product of vectors, and $A(\Theta)$ is a linear function on $\Theta$. This problem is motivated by the fact that the number of parameters in $\Theta$ is only $R \cdot \sum_{d=1}^{D} p_d$, which is significantly smaller than the $\prod_{d=1}^{D} p_d$ number of parameters in ordinary least squares regression. We consider the above CP decomposition model of tensors $\Theta$, as well as the Tucker decomposition. For both models we show how to apply data dimensionality reduction techniques based on sparse random projections $\Phi$ to reduce the problem to a much smaller problem $\min_{\Theta} \|\Phi A(\Theta) - \Phi b\|_2$, for which $\|\Phi A(\Theta) - \Phi b\|_2 = (1 \pm \varepsilon) \|A(\Theta) - b\|_2$ holds simultaneously for all $\Theta$. We obtain a significantly smaller dimension and sparsity in the randomized linear mapping $\Phi$ than is possible for ordinary least squares regression. Finally, we give a number of numerical simulations supporting our theory.
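To make the sketching step concrete, the following is a minimal numerical sketch, not the paper's algorithm: it applies a CountSketch-style sparse random projection $\Phi$ (one signed nonzero per column) to an unconstrained, matricized least squares problem $\min_x \|Ax - b\|_2$, standing in for the linear map $A(\cdot)$ acting on $\mathrm{vec}(\Theta)$. All dimensions and names here (`n`, `p`, `m`, `apply_sketch`) are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (hypothetical, not from the paper):
# n samples, p unknowns, m << n sketch rows.
n, p, m = 10_000, 60, 400

# Stand-in for the linear map A(.): a matricized design matrix acting on vec(Theta).
A = rng.standard_normal((n, p))
b = A @ rng.standard_normal(p) + 0.01 * rng.standard_normal(n)

# CountSketch-style sparse projection Phi in R^{m x n}: each column has a single
# random +/-1 entry, so Phi @ M can be applied in O(nnz(M)) time without forming Phi.
rows = rng.integers(0, m, size=n)      # target bucket for each of the n input rows
signs = rng.choice([-1.0, 1.0], size=n)

def apply_sketch(M):
    """Compute Phi @ M by scattering signed rows of M into m buckets."""
    M = np.atleast_2d(M.T).T           # treat a vector as a single-column matrix
    out = np.zeros((m, M.shape[1]))
    np.add.at(out, rows, signs[:, None] * M)
    return out.squeeze()

SA, Sb = apply_sketch(A), apply_sketch(b)

# Solve the sketched problem min ||Phi A x - Phi b||_2 and compare its residual
# on the ORIGINAL problem with that of the exact least squares solution.
x_sketch = np.linalg.lstsq(SA, Sb, rcond=None)[0]
x_exact = np.linalg.lstsq(A, b, rcond=None)[0]
print("sketched residual:", np.linalg.norm(A @ x_sketch - b))
print("exact residual:   ", np.linalg.norm(A @ x_exact - b))
```

With $m \ll n$, the sketched residual should track the exact one up to a $(1 \pm \varepsilon)$ factor; the paper's contribution is that, for the CP and Tucker models above, the required sketch dimension and sparsity are much smaller than what ordinary least squares regression demands.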