Faster Ridge Regression via the Subsampled Randomized Hadamard Transform

2020-01-16

Abstract

We propose a fast algorithm for ridge regression when the number of features is much larger than the number of observations (p ≫ n). The standard way to solve ridge regression in this setting works in the dual space and gives a running time of O(n²p). Our algorithm, Subsampled Randomized Hadamard Transform-Dual Ridge Regression (SRHT-DRR), runs in time O(np log(n)) and works by preconditioning the design matrix with a Randomized Walsh-Hadamard Transform and a subsequent subsampling of features. We provide risk bounds for our SRHT-DRR algorithm in the fixed design setting and show experimental results on synthetic and real datasets.
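The pipeline sketched in the abstract — flip random signs on the features, mix them with a fast Walsh-Hadamard transform, subsample a smaller set of features, then solve ridge regression in the dual — can be illustrated in NumPy. This is a minimal sketch under my own assumptions, not the authors' code: the function names (`fwht`, `srht_drr`), the sketch size `p_sub`, and the scaling constants are illustrative choices, and the feature dimension is zero-padded to a power of two so the Hadamard recursion applies.

```python
import numpy as np

def fwht(a):
    """In-place fast Walsh-Hadamard transform along axis 0.

    The length of axis 0 must be a power of two; runs in O(p log p)
    per column, which is what makes the overall sketch fast.
    """
    n = a.shape[0]
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            x = a[i:i + h].copy()
            y = a[i + h:i + 2 * h].copy()
            a[i:i + h] = x + y
            a[i + h:i + 2 * h] = x - y
        h *= 2
    return a

def srht_drr(X, y, lam, p_sub, rng=None):
    """Illustrative SRHT-DRR sketch (not the authors' implementation).

    Preconditions the p features of X with a randomized Walsh-Hadamard
    transform (random signs D, then H), uniformly subsamples p_sub of
    the mixed features, and solves ridge regression in the dual space.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n, p = X.shape
    # Pad the feature dimension up to a power of two for the transform.
    p2 = 1 << (p - 1).bit_length()
    Xp = np.zeros((n, p2))
    Xp[:, :p] = X
    # D: random +/-1 sign flip per feature; H: normalized Walsh-Hadamard.
    signs = rng.choice([-1.0, 1.0], size=p2)
    Z = fwht((Xp * signs).T).T / np.sqrt(p2)
    # S: uniform subsample of p_sub mixed features, rescaled.
    cols = rng.choice(p2, size=p_sub, replace=False)
    Xs = Z[:, cols] * np.sqrt(p2 / p_sub)
    # Dual ridge regression: alpha = (Xs Xs^T + lam I)^{-1} y is an
    # n x n solve, cheap when n is much smaller than p.
    alpha = np.linalg.solve(Xs @ Xs.T + lam * np.eye(n), y)
    # Primal weights in the sketched feature space.
    w = Xs.T @ alpha
    return Xs, w
```

The point of the construction is that the n×n dual system replaces the p×p primal one, and the O(p log p) Hadamard mixing replaces the O(n²p) cost of forming the exact kernel on all p features.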

