Sublinear Time Orthogonal Tensor Decomposition

2020-02-05

Abstract

A recent work (Wang et al., NIPS 2015) gives the fastest known algorithms for orthogonal tensor decomposition with provable guarantees. Their algorithm is based on computing sketches of the input tensor, which requires reading the entire input. We show that in a number of cases one can achieve the same theoretical guarantees in sublinear time, i.e., even without reading most of the input tensor. Instead of using sketches to estimate inner products in tensor decomposition algorithms, we use importance sampling. To achieve sublinear time, we need to know the norms of tensor slices, and we show how to do this in a number of important cases. For symmetric tensors T = ∑_{i=1}^k λ_i u_i^{⊗p} with λ_i > 0 for all i, we estimate such norms in sublinear time whenever p is even. For the important case of p = 3 and small values of k, we can also estimate such norms. For asymmetric tensors sublinear time is not possible in general, but we show that if the tensor slice norms are just slightly below ‖T‖_F then sublinear time is again possible. One of the main strengths of our work is empirical: in a number of cases our algorithm is orders of magnitude faster than existing methods with the same accuracy.
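The core idea the abstract describes, replacing sketch-based inner-product estimation with importance sampling, can be illustrated with a minimal sketch. This is not the authors' implementation; the function name is hypothetical, and it uses standard length-squared (norm-proportional) sampling to form an unbiased estimate of an inner product from few coordinates:

```python
import numpy as np

def sampled_inner_product(u, v, num_samples, rng=None):
    """Estimate <u, v> by sampling indices i with probability
    p_i = u_i^2 / ||u||^2 (length-squared sampling).
    Each term u_i * v_i / p_i is an unbiased estimate of <u, v>,
    so averaging num_samples of them concentrates around <u, v>
    while touching only num_samples coordinates of v."""
    rng = np.random.default_rng() if rng is None else rng
    p = u**2 / np.dot(u, u)                    # sampling distribution over indices
    idx = rng.choice(len(u), size=num_samples, p=p)
    estimates = u[idx] * v[idx] / p[idx]       # unbiased per-sample estimates
    return estimates.mean()
```

Note that the sampling distribution requires knowing ‖u‖, which mirrors the paper's requirement of knowing tensor slice norms before sampling can be applied.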

