Title
Gradients Weights improve Regression and Classification
Abstract
In regression problems over R^d, the unknown function f often varies more in some coordinates than in others. We show that weighting each coordinate i according to an estimate of the variation of f along coordinate i, e.g. the L1 norm of the i-th directional derivative of f, is an efficient way to significantly improve the performance of distance-based regressors such as kernel and k-NN regressors. The approach, termed Gradient Weighting (GW), consists of a first-pass regression estimate f_n, which serves to evaluate the directional derivatives of f, and a second-pass regression estimate on the re-weighted data. The GW approach can be instantiated for both regression and classification, and is grounded in strong theoretical principles having to do with the way regression bias and variance are affected by a generic feature-weighting scheme. These theoretical principles provide further technical foundation for some existing feature-weighting heuristics that have proved successful in practice. We propose a simple estimator of these derivative norms and prove its consistency. The proposed estimator computes efficiently and easily extends to run online. We then derive a classification version of the GW approach, which evaluates on real-world datasets with as much success as its regression counterpart.
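Read as an algorithm, the abstract describes a two-pass procedure: fit a first-pass regressor, estimate the per-coordinate derivative norms, re-weight the coordinates, then fit a second-pass distance-based regressor. The sketch below illustrates that procedure under stated assumptions only: a k-NN first pass via scikit-learn's KNeighborsRegressor, central finite differences with an illustrative step size t to estimate the derivative norms, and coordinate rescaling by the square root of the learned weights. The function names, step size, and estimator choices are hypothetical and are not taken from the authors' implementation.

```python
# A minimal sketch of the two-pass Gradient Weighting (GW) idea described in the
# abstract. Estimator choices (k-NN), the finite-difference step t, and all helper
# names are illustrative assumptions, not the paper's code.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor


def gradient_weights(X, y, k=10, t=0.1):
    """Estimate w_i ~ average |df/dx_i| with a first-pass k-NN fit and
    central finite differences along each coordinate."""
    n, d = X.shape
    f_hat = KNeighborsRegressor(n_neighbors=k).fit(X, y)  # first-pass estimate f_n
    w = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = t
        # average absolute finite-difference derivative over the sample points
        w[i] = np.mean(np.abs(f_hat.predict(X + e) - f_hat.predict(X - e)) / (2 * t))
    return w


def gw_knn_regressor(X, y, k=10, t=0.1):
    """Second-pass k-NN regressor on coordinates rescaled by sqrt(w_i), so that
    Euclidean distance on the rescaled data acts as a gradient-weighted metric."""
    w = gradient_weights(X, y, k=k, t=t)
    scale = np.sqrt(w + 1e-12)  # avoid scaling a coordinate to exactly zero
    model = KNeighborsRegressor(n_neighbors=k).fit(X * scale, y)
    return model, scale


# Usage: f varies only in coordinate 0; coordinate 1 is irrelevant noise, so its
# learned weight should be much smaller.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 2))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=500)
model, scale = gw_knn_regressor(X, y)
X_test = rng.uniform(-1, 1, size=(100, 2))
print("learned coordinate scales:", scale)
print("sample predictions:", model.predict(X_test * scale)[:5])
```

Scaling coordinate i by sqrt(w_i) makes plain Euclidean distance on the rescaled data equal to the weighted metric with weights w_i, so any off-the-shelf distance-based regressor can be reused unchanged in the second pass.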
Year
2016
Venue
JOURNAL OF MACHINE LEARNING RESEARCH
Keywords
Nonparametric learning, feature selection, feature weighting, nonparametric sparsity, metric learning
Field
Kernel (linear algebra), Weighting, Feature selection, Regression, Nonparametric regression, Heuristics, Artificial intelligence, Directional derivative, Machine learning, Mathematics, Estimator
DocType
Journal
Volume
17
ISSN
1532-4435
Citations
0
PageRank
0.34
References
0
Authors
4
Name                  Order  Citations  PageRank
Samory Kpotufe        1      92         11.56
Boularias, Abdeslam   2      105        20.64
Thomas Schultz        3      97         14.95
Kyoungok Kim          4      34         4.30