Fast and Memory Optimal Low-Rank Matrix Approximation

2020-02-04

Abstract

In this paper, we revisit the problem of constructing a near-optimal rank-$k$ approximation of a matrix $M \in [0,1]^{m \times n}$ under the streaming data model, where the columns of $M$ are revealed sequentially. We present SLA (Streaming Low-rank Approximation), an algorithm that is asymptotically accurate when $k\,\sigma_{k+1}(M) = o(\sqrt{mn})$, where $\sigma_{k+1}(M)$ denotes the $(k+1)$-th largest singular value of $M$. This means that its average mean-square error converges to 0 as $m$ and $n$ grow large, i.e., $\frac{1}{mn}\|\hat{M}^{(k)} - M^{(k)}\|_F^2 \to 0$ with high probability, where $\hat{M}^{(k)}$ and $M^{(k)}$ denote the output of SLA and the optimal rank-$k$ approximation of $M$, respectively. Our algorithm makes one pass over the data if the columns of $M$ are revealed in a random order, and two passes if the columns of $M$ arrive in an arbitrary order. To reduce its memory footprint and complexity, SLA uses random sparsification and samples each entry of $M$ with a small probability $\delta$. In turn, SLA is memory optimal, as its required memory space scales as $k(m+n)$, the dimension of its output. Furthermore, SLA is computationally efficient, as it runs in $O(\delta k m n)$ time (a constant number of operations is made for each observed entry of $M$), which can be made much smaller than $kmn$ for an appropriate choice of $\delta$.
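The random-sparsification idea underlying SLA can be illustrated in a few lines: keep each entry of $M$ independently with probability $\delta$, rescale the kept entries by $1/\delta$ so the sparsified matrix is an unbiased estimate of $M$, and take a truncated SVD of the result. The sketch below is not the streaming SLA algorithm itself (it is batch and uses a full SVD); the function name `sparsified_rank_k` and all parameter values are illustrative assumptions, used only to show that a rank-$k$ approximation computed from a small sample of the entries can still be close to $M$:

```python
import numpy as np

def sparsified_rank_k(M, k, delta, rng):
    """Rank-k approximation of M from a delta-fraction sample of its entries.

    Illustrative only: batch version of the entry-sampling idea, not SLA.
    """
    # Keep each entry independently with probability delta; rescaling the
    # kept entries by 1/delta makes the sparsified matrix unbiased for M.
    mask = rng.random(M.shape) < delta
    S = np.where(mask, M / delta, 0.0)
    # Truncated SVD of the sparse proxy yields the rank-k approximation.
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k]

rng = np.random.default_rng(0)
m, n, k = 200, 300, 5
# Low-rank ground truth with entries in [0, 1], matching the abstract's model.
A = (rng.random((m, k)) @ rng.random((k, n))) / k
approx = sparsified_rank_k(A, k, delta=0.5, rng=rng)
mse = np.mean((approx - A) ** 2)  # small despite observing ~half the entries
```

Here roughly half the entries are observed, yet the rank-$k$ reconstruction has a small mean-square error; SLA makes this trade-off precise, with $\delta$ chosen so that accuracy is preserved while the running time $O(\delta k m n)$ shrinks.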

