Subspace Embedding and Linear Regression with Orlicz Norm

2020-03-16

Abstract

We consider a generalization of the classic linear regression problem to the case where the loss is an Orlicz norm. An Orlicz norm is parameterized by a non-negative convex function G : ℝ₊ → ℝ₊ with G(0) = 0: the Orlicz norm of a vector x ∈ ℝⁿ is defined as ‖x‖_G = inf{ α > 0 : Σᵢ₌₁ⁿ G(|xᵢ|/α) ≤ 1 }. We consider the cases where the function G(·) grows subquadratically. Our main result is based on a new oblivious embedding which embeds the column space of a given matrix A ∈ ℝ^{n×d} with Orlicz norm into a lower-dimensional space with ℓ₂ norm. Specifically, we show how to efficiently find an embedding matrix S ∈ ℝ^{m×n}, m < n, such that for all x ∈ ℝ^d, Ω(1/(d log n)) · ‖Ax‖_G ≤ ‖SAx‖₂ ≤ O(d² log n) · ‖Ax‖_G. By applying this subspace embedding technique, we obtain an approximation algorithm for the regression problem min_{x ∈ ℝ^d} ‖Ax − b‖_G, up to an O(d log² n) approximation factor. As a further application of our techniques, we show how to also use them to improve on the algorithm for the ℓ_p low rank matrix approximation problem for 1 ≤ p < 2.
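The Orlicz norm defined above can be evaluated numerically: for a non-negative convex G with G(0) = 0, the map α ↦ Σᵢ G(|xᵢ|/α) is non-increasing, so the infimum is the root of Σᵢ G(|xᵢ|/α) = 1 and can be found by bisection. A minimal sketch (the function name `orlicz_norm` and the bracketing strategy are illustrative, not from the paper):

```python
def orlicz_norm(x, G, iters=200):
    """Compute ||x||_G = inf{ a > 0 : sum_i G(|x_i|/a) <= 1 } by bisection.

    Assumes G is non-negative, convex, and G(0) = 0, so that
    a -> sum_i G(|x_i|/a) is non-increasing in a.
    """
    if all(v == 0 for v in x):
        return 0.0

    def excess(a):
        # Positive when a is too small (constraint violated), negative/zero when feasible.
        return sum(G(abs(v) / a) for v in x) - 1.0

    # Bracket the root: grow hi until feasible, shrink lo until infeasible.
    lo = hi = max(abs(v) for v in x)
    while excess(hi) > 0:
        hi *= 2.0
    while excess(lo) <= 0:
        lo /= 2.0

    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if excess(mid) <= 0:
            hi = mid  # feasible: the infimum is at most mid
        else:
            lo = mid  # infeasible: the infimum exceeds mid
    return hi
```

With G(t) = t² this recovers the ℓ₂ norm and with G(t) = t the ℓ₁ norm; subquadratic choices of G, such as Huber-type functions, interpolate between the two, which is the regime the paper studies.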

