Abstract
We revisit the classic randomized sketch of a tensor product of $q$ vectors $x_1, \ldots, x_q \in \mathbb{R}^n$. The $i$-th coordinate $(Sx)_i$ of the sketch is equal to $\frac{1}{\sqrt{m}} \prod_{j=1}^{q} \langle u^{i,j}, x_j \rangle$, where the $u^{i,j}$ are independent random sign vectors. Kar and Karnick (JMLR, 2012) show that if the sketching dimension $m = \Omega(\epsilon^{-2} C_\Omega^2 \log(1/\delta))$, where $C_\Omega$ is a certain property of the point set $\Omega$ one wants to sketch, then with probability $1-\delta$, $\|Sx\|_2 = (1 \pm \epsilon)\|x\|_2$ for all $x \in \Omega$. However, in their analysis $C_\Omega^2$ can be as large as $\Theta(n^{2q})$, even for a set $\Omega$ of $O(1)$ vectors $x$. We give a new analysis of this sketch, providing nearly optimal bounds. Namely, we show an upper bound of $m = \Theta\!\left(\epsilon^{-2}\log(n/\delta) + \epsilon^{-1}\log^q(n/\delta)\right)$, which by composing with CountSketch, can be improved to $m = \Theta\!\left(\epsilon^{-2}\log(1/(\delta\epsilon)) + \epsilon^{-1}\log^q(1/(\delta\epsilon))\right)$. For the important case of $q = 2$ and $\delta = 1/\mathrm{poly}(n)$, this shows that $m = \Theta\!\left(\epsilon^{-2}\log n + \epsilon^{-1}\log^2 n\right)$, demonstrating that the $\epsilon^{-2}$ and $\log^2 n$ terms do not multiply each other. We also show a nearly matching lower bound of $m = \Omega\!\left(\epsilon^{-2}\log(1/\delta) + \epsilon^{-1}\log^q(1/\delta)\right)$. In a number of applications, one has $|\Omega| = \mathrm{poly}(n)$, and in this case our bounds are optimal up to a constant factor. This is the first high probability sketch for tensor products that has optimal sketch size and can be implemented in $O\!\left(\sum_{j=1}^{q} \mathrm{nnz}(x_j)\right)$ time, where $\mathrm{nnz}(x_j)$ is the number of non-zero entries of $x_j$. Lastly, we empirically compare our sketch to other sketches for tensor products, and give a novel application to compressing neural networks.
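To make the construction concrete, the following is a minimal NumPy sketch of the randomized map described above: each of the $m$ output coordinates is the product of $q$ independent sign-vector inner products, scaled by $1/\sqrt{m}$. The function name `tensor_sketch` and the dense sign matrices are illustrative choices, not the paper's implementation (which runs in input-sparsity time); the sanity check uses the fact that $\mathbb{E}\|Sx\|_2^2 = \prod_j \|x_j\|_2^2$, the squared norm of the tensor product.

```python
import numpy as np

def tensor_sketch(xs, m, rng):
    """Sketch of the tensor product of the vectors in xs.

    The i-th coordinate is (1/sqrt(m)) * prod_j <u^{i,j}, x_j>,
    where each u^{i,j} is an independent uniform +-1 vector.
    (Illustrative dense version; the paper's sketch admits a
    faster input-sparsity-time implementation.)
    """
    n = len(xs[0])
    s = np.ones(m)
    for x in xs:
        U = rng.choice([-1.0, 1.0], size=(m, n))  # fresh signs per mode
        s *= U @ x
    return s / np.sqrt(m)

rng = np.random.default_rng(0)
n, q, m = 30, 2, 20000
xs = [rng.standard_normal(n) for _ in range(q)]

s = tensor_sketch(xs, m, rng)
est_sq = s @ s                                    # ||S x||_2^2
true_sq = np.prod([x @ x for x in xs])            # ||x_1||^2 * ... * ||x_q||^2
print(true_sq, est_sq)
```

With $q = 2$ and a moderate $m$, the squared sketch norm concentrates around the true squared tensor-product norm, matching the $(1 \pm \epsilon)$ guarantee in expectation.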