Title: Optimal Estimation of Derivatives in Nonparametric Regression
Abstract: We propose a simple framework for estimating derivatives in nonparametric regression without fitting the regression function itself. Unlike most existing methods, which use symmetric difference quotients, our method is constructed as a linear combination of observations. It is hence very flexible, applies to both interior and boundary points, and includes most existing methods as special cases. Within this framework, we define the variance-minimizing estimators for derivatives of any order of the regression function at a fixed bias-reduction level. For the equidistant design, we derive the asymptotic variance and bias of these estimators. We also show that our new method achieves, for the first time among difference-based estimators, the asymptotically optimal convergence rate. Finally, we provide an effective criterion for selecting the tuning parameters and demonstrate the usefulness of the proposed method through extensive simulation studies of the first- and second-order derivative estimators.
Year: 2016
Venue: Journal of Machine Learning Research
Keywords: Linear combination, Nonparametric derivative estimation, Nonparametric regression, Optimal sequence, Taylor expansion
Field: Linear combination, Mathematical optimization, Polynomial regression, Nonparametric regression, Optimal estimation, Rate of convergence, Delta method, Asymptotically optimal algorithm, Mathematics, Estimator
DocType: Journal
Volume: 17
ISSN: 1532-4435
Citations: 0
PageRank: 0.34
References: 0
Authors: 3
Name            Order  Citations  PageRank
Wenlin Dai      1      0          0.34
Tiejun Tong     2      27         7.70
Marc G. Genton  3      167        21.72