Gradient Weights help Nonparametric Regressors

Abstract

In regression problems over ℝ^d, the unknown function f often varies more in some coordinates than in others. We show that weighting each coordinate i with the estimated norm of the ith derivative of f is an efficient way to significantly improve the performance of distance-based regressors, e.g. kernel and k-NN regressors. We propose a simple estimator of these derivative norms and prove its consistency. Moreover, the proposed estimator is efficiently learned online.
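The abstract only sketches the method at a high level, so the following is a minimal Python illustration of the idea: fit a pilot k-NN estimate, approximate each coordinate's derivative norm E|∂_i f| by averaged central finite differences, and rescale coordinates by these weights before fitting the final distance-based regressor. The step size `t`, neighbor count `k`, and the coordinate-rescaling form of the weighted metric are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def gradient_weights(X, y, t=0.1, k=5):
    """Estimate per-coordinate derivative norms E|f'_i| using a pilot
    k-NN estimate and central finite differences (a sketch of the idea;
    t and k are illustrative choices)."""
    pilot = KNeighborsRegressor(n_neighbors=k).fit(X, y)
    n, d = X.shape
    w = np.empty(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = t
        # average |f_n(x + t e_i) - f_n(x - t e_i)| / (2t) over the sample
        w[i] = np.mean(np.abs(pilot.predict(X + e) - pilot.predict(X - e)) / (2 * t))
    return w

# Toy usage: f varies strongly along coordinate 1, weakly along coordinate 2.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(500, 2))
y = np.sin(4 * X[:, 0]) + 0.1 * X[:, 1] + 0.01 * rng.standard_normal(500)

w = gradient_weights(X, y)
knn_plain = KNeighborsRegressor(n_neighbors=5).fit(X, y)
knn_weighted = KNeighborsRegressor(n_neighbors=5).fit(X * w, y)  # weighted metric via rescaled coords

X_test = rng.uniform(0, 1, size=(200, 2))
y_test = np.sin(4 * X_test[:, 0]) + 0.1 * X_test[:, 1]
print("plain k-NN MSE:   ", np.mean((knn_plain.predict(X_test) - y_test) ** 2))
print("weighted k-NN MSE:", np.mean((knn_weighted.predict(X_test * w) - y_test) ** 2))
```

Because the weights can be computed as a running average of finite differences over incoming samples, a variant of this estimator can also be maintained online, as the abstract notes.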
