
Faster Principal Component Regression and Stable Matrix Chebyshev Approximation

2020-03-09

Abstract

We solve principal component regression (PCR), up to a multiplicative accuracy 1+γ, by reducing the problem to Õ(γ⁻¹) black-box calls of ridge regression. Therefore, our algorithm does not require any explicit construction of the top principal components, and is suitable for large-scale PCR instances. In contrast, the previous result requires Õ(γ⁻²) such black-box calls. We obtain this result by developing a general stable recurrence formula for matrix Chebyshev polynomials, and a degree-optimal polynomial approximation to the matrix sign function. Our techniques may be of independent interest, especially when designing iterative methods.
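To make the reduction concrete, here is a minimal numpy sketch of the idea behind the abstract: PCR projects the regression onto principal directions whose eigenvalues exceed a threshold λ, and that projector can be written through the matrix sign function of M = (AᵀA − λI)(AᵀA + λI)⁻¹, where the inverse corresponds to a ridge-regression solve. The sketch below uses the simple Newton–Schulz polynomial iteration x ← (3x − x³)/2 as a stand-in for the paper's degree-optimal, stable Chebyshev construction; all function names and the toy data are illustrative assumptions, not the authors' code.

```python
import numpy as np

def exact_pcr(A, b, lam):
    """Reference PCR: regress b on the singular directions of A with sigma^2 >= lam."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    keep = s**2 >= lam
    coef = (U[:, keep].T @ b) / s[keep]     # least squares within the kept subspace
    return Vt[keep].T @ coef

def pcr_via_sign_approx(A, b, lam, iters=20):
    """Sketch: PCR via a polynomial sign-function filter (illustrative, not the paper's method).

    M = (A^T A - lam I)(A^T A + lam I)^{-1} has eigenvalues in (-1, 1), positive
    exactly on the principal directions above the threshold.  sign(M) therefore
    encodes the PCR projector P = (I + sign(M)) / 2.  The Newton-Schulz iteration
    x <- (3x - x^3)/2 drives eigenvalues to +-1; the paper replaces this with a
    stable, degree-optimal Chebyshev approximation using only ridge-type solves.
    """
    d = A.shape[1]
    G = A.T @ A
    # the (G + lam I)^{-1} application is what a black-box ridge regression provides
    M = np.linalg.solve(G + lam * np.eye(d), G - lam * np.eye(d))
    X = M
    for _ in range(iters):
        X = 1.5 * X - 0.5 * (X @ X @ X)     # polynomial iteration toward sign(M)
    P = 0.5 * (np.eye(d) + X)               # approximate projector onto top eigenspace
    return P @ np.linalg.pinv(A @ P) @ b    # regress within the projected subspace

# toy problem with a clear eigengap around the threshold lam (synthetic data)
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((40, 6)))
V, _ = np.linalg.qr(rng.standard_normal((6, 6)))
A = (U * np.array([10.0, 8.0, 6.0, 0.5, 0.3, 0.1])) @ V.T
b = rng.standard_normal(40)
lam = 4.0                                   # keeps sigma^2 = 100, 64, 36; drops the rest
x_exact = exact_pcr(A, b, lam)
x_approx = pcr_via_sign_approx(A, b, lam)
```

On this toy instance the two solutions agree to high precision; the point of the paper is that the filtering step can be done stably with only Õ(γ⁻¹) ridge solves and no eigendecomposition.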
