
Gradient Weights help Nonparametric Regressors


Abstract

In regression problems over ℝ^d, the unknown function f often varies more in some coordinates than in others. We show that weighting each coordinate i with the estimated norm of the ith derivative of f is an efficient way to significantly improve the performance of distance-based regressors, e.g. kernel and k-NN regressors. We propose a simple estimator of these derivative norms and prove its consistency. Moreover, the proposed estimator is efficiently learned online.
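The idea in the abstract can be sketched in a few lines: estimate each derivative norm ‖∂_i f‖ by finite differences of a pilot nonparametric estimate, then use those norms as per-coordinate weights in the distance metric of a k-NN regressor. The following is a minimal illustrative sketch, not the paper's exact estimator; the function names, the finite-difference step `t`, and the choice of a k-NN pilot are assumptions made for this example.

```python
import numpy as np

def knn_predict(X_train, y_train, X_query, k, w):
    """k-NN regression under the weighted metric rho(x, x') = ||w * (x - x')||."""
    # Per-coordinate weighted differences between every query and training point.
    diff = (X_query[:, None, :] - X_train[None, :, :]) * w
    dist = np.linalg.norm(diff, axis=-1)
    # Average the targets of the k nearest training points.
    idx = np.argsort(dist, axis=1)[:, :k]
    return y_train[idx].mean(axis=1)

def gradient_weights(X, y, k=10, t=0.1):
    """Estimate w_i ~ E|d f / d x_i| via finite differences of a pilot k-NN estimate.

    (Illustrative only: the paper proposes its own simple, consistent,
    online-learnable estimator of the derivative norms.)
    """
    n, d = X.shape
    unit_w = np.ones(d)          # pilot estimate uses the unweighted metric
    w = np.empty(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = t
        # Central differences of the pilot estimate along coordinate i.
        f_plus = knn_predict(X, y, X + e, k, unit_w)
        f_minus = knn_predict(X, y, X - e, k, unit_w)
        w[i] = np.mean(np.abs(f_plus - f_minus) / (2 * t))
    return w
```

Usage: on data where f depends on only one coordinate, say f(x) = sin(3·x₀) in three dimensions, the estimated weight for coordinate 0 dominates, so the weighted metric effectively discounts the irrelevant coordinates.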

