We consider a generalization of the classic linear regression problem to the case when the loss is an Orlicz norm. An Orlicz norm is parameterized by a non-negative convex function $G : \mathbb{R}_+ \to \mathbb{R}_+$ with $G(0) = 0$: the Orlicz norm of a vector $x \in \mathbb{R}^n$ is defined as $\|x\|_G = \inf\{\alpha > 0 \mid \sum_{i=1}^n G(|x_i|/\alpha) \le 1\}$. We consider the cases where the function $G(\cdot)$ grows subquadratically. Our main result is based on a new oblivious embedding which embeds the column space of a given matrix $A \in \mathbb{R}^{n \times d}$ with Orlicz norm into a lower dimensional space with $\ell_2$ norm. Specifically, we show how to efficiently find an embedding matrix $S \in \mathbb{R}^{m \times n}$, $m < n$, such that $\forall x \in \mathbb{R}^d$, $\Omega(1/(d \log n)) \cdot \|Ax\|_G \le \|SAx\|_2 \le O(d^2 \log n) \cdot \|Ax\|_G$. By applying this subspace embedding technique, we show an approximation algorithm for the regression problem $\min_{x \in \mathbb{R}^d} \|Ax - b\|_G$, up to a $O(d \log^2 n)$ approximation factor. As a further application of our techniques, we show how to also use them to improve on the algorithm for the $\ell_p$ low rank matrix approximation problem for $1 \le p < 2$.
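The Orlicz norm definition above can be made concrete with a short numerical sketch. The snippet below is illustrative only (it is not part of the paper's algorithm): it computes $\|x\|_G = \inf\{\alpha > 0 \mid \sum_i G(|x_i|/\alpha) \le 1\}$ by bisection on $\alpha$, using the fact that $\sum_i G(|x_i|/\alpha)$ is non-increasing in $\alpha$ for a non-negative $G$ with $G(0)=0$. The function name `orlicz_norm` and the tolerance parameter are our own choices.

```python
import numpy as np

def orlicz_norm(x, G, tol=1e-9):
    """Illustrative sketch: ||x||_G = inf{alpha > 0 : sum_i G(|x_i|/alpha) <= 1},
    computed by bisection. G should be non-negative, convex, with G(0) = 0."""
    x = np.abs(np.asarray(x, dtype=float))
    if not x.any():
        return 0.0
    # Bracket the infimum: grow hi until the constraint sum <= 1 holds at alpha = hi.
    lo, hi = 1e-12, 1.0
    while np.sum(G(x / hi)) > 1.0:
        hi *= 2.0
    # Bisect: the constraint holds for all alpha >= the Orlicz norm.
    while hi - lo > tol * hi:
        mid = 0.5 * (lo + hi)
        if np.sum(G(x / mid)) <= 1.0:
            hi = mid
        else:
            lo = mid
    return hi

# Sanity checks: G(t) = t^2 recovers the l2 norm, G(t) = t recovers the l1 norm.
print(orlicz_norm([3.0, 4.0], lambda t: t ** 2))   # approx 5.0
print(orlicz_norm([1.0, 2.0, 3.0], lambda t: t))   # approx 6.0
```

With $G(t) = t^2$ the constraint $\sum_i (|x_i|/\alpha)^2 \le 1$ is exactly $\alpha \ge \|x\|_2$, so the Orlicz norm specializes to the Euclidean norm; subquadratic choices of $G$ (e.g. the Huber-type losses) give the regimes the paper targets.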