
Gradient Weights help Nonparametric Regressors


Abstract

In regression problems over $\mathbb{R}^d$, the unknown function f often varies more in some coordinates than in others. We show that weighting each coordinate i with the estimated norm of the ith derivative of f is an efficient way to significantly improve the performance of distance-based regressors, e.g. kernel and k-NN regressors. We propose a simple estimator of these derivative norms and prove its consistency. Moreover, the proposed estimator is efficiently learned online.
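The abstract describes the idea at a high level: estimate a weight W_i ≈ E|∂f/∂x_i| for each coordinate, then rescale the input space by these weights before running any distance-based regressor. Below is a minimal sketch of that idea, assuming a k-NN pilot estimator and plain central finite differences; the paper's actual estimator uses a kernel pilot and additional safeguards for consistency, which are omitted here. All function names and the parameters `t` and `k` are illustrative, not from the paper.

```python
import numpy as np

def estimate_gradient_weights(X, y, t=0.1, k=10):
    """Estimate W_i ~ E|df/dx_i| by central finite differences of a pilot
    k-NN regressor. A hypothetical simplification of the paper's estimator."""
    n, d = X.shape

    def knn_predict(queries):
        # brute-force k-NN pilot regressor (fine for a toy-sized sample)
        preds = np.empty(len(queries))
        for j, q in enumerate(queries):
            idx = np.argsort(np.linalg.norm(X - q, axis=1))[:k]
            preds[j] = y[idx].mean()
        return preds

    W = np.empty(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = t
        # central difference of the pilot estimate along coordinate i,
        # averaged over the sample to approximate E|df/dx_i|
        diffs = (knn_predict(X + e) - knn_predict(X - e)) / (2 * t)
        W[i] = np.abs(diffs).mean()
    return W

def weighted_knn_regress(X, y, W, query, k=10):
    """k-NN regression under the gradient-weighted metric
    rho(x, x') = ||W * (x - x')||."""
    dists = np.linalg.norm((X - query) * W, axis=1)
    idx = np.argsort(dists)[:k]
    return y[idx].mean()

# Toy example: f depends only on the first of 5 coordinates, so the
# learned weights should down-weight the 4 irrelevant directions.
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 5))
y = np.sin(4 * X[:, 0]) + 0.1 * rng.normal(size=500)
W = estimate_gradient_weights(X, y)
print(W)  # largest weight expected on coordinate 0
print(weighted_knn_regress(X, y, W, X[0]))
```

Since the weights are simple per-coordinate averages of finite differences, they can also be maintained as running means, which is consistent with the paper's claim that the estimator is efficiently learned online.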
