Adaptive Feature Selection: Computationally Efficient Online Sparse Linear Regression under RIP

2020-03-09

Abstract

Online sparse linear regression is an online problem in which an algorithm repeatedly chooses a subset of coordinates to observe in an adversarially chosen feature vector, makes a real-valued prediction, receives the true label, and incurs the squared loss. The goal is to design an online learning algorithm with sublinear regret relative to the best sparse linear predictor in hindsight. Without any assumptions, this problem is known to be computationally intractable. In this paper, we assume that the data matrix satisfies the restricted isometry property (RIP), and show that this assumption leads to computationally efficient algorithms with sublinear regret for two variants of the problem. In the first variant, the true label is generated according to a sparse linear model with additive Gaussian noise. In the second, the true label is chosen adversarially.
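To make the interaction protocol concrete, the following is a minimal simulation sketch of the online sparse regression game described above. The coordinate-selection rule and the gradient-step predictor here are naive placeholders chosen only to illustrate the observe/predict/loss loop; they are not the paper's RIP-based algorithm, and the function name and learning rate are our own assumptions.

```python
import numpy as np

def online_sparse_regression(features, labels, k, rng=None):
    """Toy simulation of the online sparse regression protocol.

    Each round: the learner picks k coordinates to observe in the
    feature vector, predicts a real value from those entries alone,
    then sees the true label and incurs the squared loss.
    The random coordinate choice and SGD update are placeholders,
    not the paper's algorithm.
    """
    rng = np.random.default_rng(rng)
    T, d = features.shape
    w = np.zeros(d)              # running weight estimate
    losses = []
    for t in range(T):
        S = rng.choice(d, size=k, replace=False)   # coordinates to observe
        x_obs = features[t, S]                     # only these entries revealed
        y_hat = w[S] @ x_obs                       # predict from observed entries
        y = labels[t]                              # true label revealed
        losses.append((y_hat - y) ** 2)            # squared loss this round
        w[S] += 0.1 * (y - y_hat) * x_obs          # SGD step on observed coords
    return np.array(losses)
```

The regret of a real algorithm would compare the cumulative sum of these per-round losses against the loss of the best k-sparse linear predictor chosen in hindsight.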
