Title: Linear Contour Learning: A Method for Supervised Dimension Reduction
Abstract: We propose a novel approach to sufficient dimension reduction in regression, based on estimating contour directions of negligible variation for the response surface. These directions span the orthogonal complement of the minimal space relevant for the regression, and can be extracted according to a measure of the variation in the response, leading to General Contour Regression (GCR). In comparison to existing sufficient dimension reduction techniques, this contour-based methodology guarantees exhaustive estimation of the central space under ellipticity of the predictor distribution and very mild additional assumptions, while maintaining √n-consistency and computational ease. Moreover, it proves to be robust to departures from ellipticity. We also establish some useful population properties for GCR. Simulations comparing its performance with that of standard techniques such as ordinary least squares, sliced inverse regression, principal Hessian directions, and sliced average variance estimation confirm the advantages anticipated by theoretical analyses. We also demonstrate the use of contour-based methods on a data set concerning grades of students from Massachusetts colleges.
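The contour idea summarized in the abstract can be illustrated numerically. The sketch below is not the paper's implementation; it is a minimal, hypothetical version of the underlying principle: for pairs of observations with similar responses, the predictor differences lie mostly along contour (response-invariant) directions, so the central space can be estimated from the eigenvectors with the smallest eigenvalues of the averaged outer products of those differences. The cutoff `eps`, the sample sizes, and the simulated model are all illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 400, 4
X = rng.standard_normal((n, p))                # predictors, here already ~ N(0, I)
beta = np.array([1.0, 1.0, 0.0, 0.0]) / np.sqrt(2)
y = X @ beta + 0.1 * rng.standard_normal(n)    # response depends on one direction only

# Standardize predictors. (In general one would whiten with the full covariance
# matrix and map the estimate back; marginal scaling suffices for this N(0, I) toy.)
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

eps = 0.1  # hypothetical cutoff defining "similar responses"

# Accumulate outer products of predictor differences over pairs with |y_i - y_j| < eps.
K = np.zeros((p, p))
count = 0
for i in range(n):
    close = np.abs(y - y[i]) < eps
    close[i] = False                # exclude the trivial self-pair
    d = Xs[close] - Xs[i]           # differences to all "same-contour" neighbors
    K += d.T @ d
    count += close.sum()
K /= count

# Contour directions dominate K; the central space corresponds to the
# eigenvectors with the *smallest* eigenvalues (eigh returns ascending order).
w, V = np.linalg.eigh(K)
b_hat = V[:, 0]

# Alignment of the estimated direction with the true one.
cos = abs(b_hat @ beta)
```

On this toy model the recovered direction `b_hat` is closely aligned with `beta` (cosine near 1), while the ordinary covariance of X carries no such information, which is the point of conditioning on pairs with similar responses.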
Year: 2004
Venue: Uncertainty in Artificial Intelligence
Keywords: response surface, linear contour learning, sufficient dimension reduction, minimal space, sliced inverse regression, contour-based method, sliced average variance estimation, exhaustive estimation, supervised dimension reduction, contour-based methodology, negligible variation, central space, dimension reduction
DocType:
Volume: abs/1408.3359
ISBN: 0-9749039-0-6
Conference:
Citations: 0
PageRank: 0.34
References: 0
Authors: 3
Name                   Order  Citations  PageRank
Bing Li                1      5          2.33
Hongyuan Zha           2      6703       422.09
Francesca Chiaromonte  3      5          3.03