Principal Component Projection and Regression in Nearly Linear Time through Asymmetric SVRG

2020-02-21

Abstract

Given a data matrix A ∈ ℝ^{n×d}, principal component projection (PCP) and principal component regression (PCR), i.e. projection and regression restricted to the top-eigenspace of A, are fundamental problems in machine learning, optimization, and numerical analysis. In this paper we provide the first algorithms that solve these problems in nearly linear time for fixed eigenvalue distribution and large n. This improves upon previous methods, which have superlinear running times when both the number of top eigenvalues and the inverse gap between eigenspaces are large. We achieve our results by applying rational polynomial approximations to reduce PCP and PCR to solving asymmetric linear systems, which we solve by a variant of SVRG. We corroborate these findings with preliminary empirical experiments.
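To make the problem definitions concrete, below is a minimal NumPy sketch of the PCP and PCR objectives computed via an exact eigendecomposition of A^T A. This is only a reference baseline illustrating what the paper computes, not its nearly-linear-time asymmetric-SVRG algorithm; the threshold `lam`, matrix shapes, and function name are illustrative assumptions.

```python
# Reference baseline (assumed, not the paper's method): PCP and PCR via an
# exact eigendecomposition of A^T A, costing roughly O(nd^2 + d^3) time,
# in contrast with the nearly linear time algorithms of the paper.
import numpy as np


def pcp_pcr_exact(A, v, b, lam):
    """PCP of v and PCR against b, restricted to eigenvalues of A^T A above lam."""
    evals, V = np.linalg.eigh(A.T @ A)   # exact eigendecomposition (the expensive step)
    top = evals > lam                    # indicator of the top eigenspace
    V_top = V[:, top]
    x_pcp = V_top @ (V_top.T @ v)        # projection of v onto the top eigenspace
    # PCR: least-squares solution constrained to the top eigenspace,
    # x_pcr = V_top diag(1 / evals_top) V_top^T A^T b.
    x_pcr = V_top @ ((V_top.T @ (A.T @ b)) / evals[top])
    return x_pcp, x_pcr


# Illustrative usage with random data (shapes and threshold are assumptions).
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 20))       # n = 200 samples, d = 20 features
v = rng.standard_normal(20)              # vector to project (PCP)
b = rng.standard_normal(200)             # regression targets (PCR)
x_pcp, x_pcr = pcp_pcr_exact(A, v, b, lam=200.0)
```

The paper instead avoids eigendecomposing A^T A altogether: it applies rational polynomial approximations to reduce PCP and PCR to asymmetric linear systems, which are then solved by a variant of SVRG.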

