An Efficient, Sparsity-Preserving, Online Algorithm for Low-Rank Approximation

Abstract

Low-rank matrix approximation is a fundamental tool in data analysis for processing large datasets, reducing noise, and finding important signals. In this work, we present a novel truncated LU factorization called Spectrum-Revealing LU (SRLU) for effective low-rank matrix approximation, and develop a fast algorithm to compute an SRLU factorization. We provide both matrix and singular value approximation error bounds for the SRLU approximation computed by our algorithm. Our analysis suggests that SRLU is competitive with the best low-rank matrix approximation methods, deterministic or randomized, in both computational complexity and approximation quality. Numerical experiments illustrate that SRLU preserves sparsity, highlights important data features and variables, can be efficiently updated, and calculates data approximations nearly as accurately as possible. To the best of our knowledge, this is the first practical variant of the LU factorization for effective and efficient low-rank matrix approximation.
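The core mechanism the abstract describes, a truncated LU factorization used as a low-rank approximation, can be illustrated with a short sketch. The snippet below is not the paper's SRLU algorithm (SRLU's contribution is its spectrum-revealing pivoting strategy and the resulting error guarantees); it only shows the generic idea of keeping the leading k columns of L and the leading k rows of U from an ordinary pivoted LU factorization. The function name `truncated_lu_approx` and the test matrix are illustrative assumptions, not code from the paper.

```python
import numpy as np
from scipy.linalg import lu

def truncated_lu_approx(A, k):
    """Rank-k approximation from a pivoted LU factorization.

    A generic truncated-LU sketch, NOT the paper's SRLU algorithm:
    SRLU additionally performs spectrum-revealing row/column swaps
    so that the retained factors yield provable error bounds.
    """
    P, L, U = lu(A)                    # A = P @ L @ U (partial pivoting)
    return P @ (L[:, :k] @ U[:k, :])   # keep k columns of L, k rows of U

# Build a test matrix with geometrically decaying singular values.
rng = np.random.default_rng(0)
m, n, k = 200, 100, 10
Q1, _ = np.linalg.qr(rng.standard_normal((m, n)))
Q2, _ = np.linalg.qr(rng.standard_normal((n, n)))
sigma = 2.0 ** -np.arange(n)
A = (Q1 * sigma) @ Q2.T               # singular values are exactly sigma

A_k = truncated_lu_approx(A, k)
print("truncated-LU error :", np.linalg.norm(A - A_k, 2))
print("optimal rank-k err :", sigma[k])  # (k+1)-th singular value
```

Comparing the truncated-LU error to sigma[k], the best possible rank-k error in the spectral norm, shows how close a pivoted LU truncation can come to the optimum on an easy matrix; the paper's analysis is precisely about guaranteeing near-optimal behavior for its SRLU variant.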
