Title
Efficient sparsification for Gaussian process regression.
Abstract
Sparse Gaussian process models provide an efficient way to perform regression on large data sets. Sparsification approaches deal with the selection of a representative subset of available training data for inducing the sparse model approximation. A variety of insertion and deletion criteria have been proposed, but they either lack accuracy or suffer from high computational costs. In this paper, we present a new and straightforward criterion for successive selection and deletion of training points in sparse Gaussian process regression. The proposed novel strategies for sparsification are as fast as purely randomized schemes and, thus, appropriate for applications in online learning. Experiments on real-world robot data demonstrate that the obtained regression models are competitive with computationally intensive state-of-the-art methods in terms of generalization and accuracy. Furthermore, we employ our approach to learn inverse dynamics models for compliant robot control using very large data sets, i.e., with half a million training points. This experiment also shows that our approximated sparse Gaussian process model is sufficiently fast for real-time prediction in robot control.
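The abstract describes selecting a representative subset of training points to induce a sparse Gaussian process approximation. The sketch below is not the paper's criterion; it is a generic illustration, assuming a standard greedy selection that maximizes residual prior variance (a pivoted Cholesky of the kernel matrix) combined with a subset-of-regressors predictive mean. All function names, the RBF kernel, and the parameter values are illustrative choices.

```python
import numpy as np

def rbf(X, Z, lengthscale=1.0):
    """Squared-exponential kernel with unit signal variance."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def greedy_inducing_points(X, m, lengthscale=1.0):
    """Greedily pick m inducing points by maximizing the residual prior
    variance; this is a pivoted Cholesky factorization of the kernel matrix."""
    n = X.shape[0]
    resid = np.ones(n)                # k(x, x) = 1 for this kernel
    L = np.zeros((n, m))
    chosen = []
    for j in range(m):
        i = int(np.argmax(resid))     # point the current subset explains worst
        chosen.append(i)
        k = rbf(X, X[i:i + 1], lengthscale)[:, 0]
        col = (k - L[:, :j] @ L[i, :j]) / np.sqrt(resid[i])
        L[:, j] = col
        resid = np.maximum(resid - col ** 2, 0.0)
    return chosen

def sor_predict(X, y, X_test, chosen, lengthscale=1.0, noise_var=1e-2):
    """Subset-of-regressors predictive mean for the chosen inducing set;
    only an m x m linear system is solved, not n x n."""
    Xu = X[chosen]
    Kuu = rbf(Xu, Xu, lengthscale)
    Kuf = rbf(Xu, X, lengthscale)
    Kus = rbf(Xu, X_test, lengthscale)
    A = noise_var * Kuu + Kuf @ Kuf.T
    return Kus.T @ np.linalg.solve(A, Kuf @ y)

# Illustrative usage: regress a sine curve with 12 of 200 training points.
X = np.linspace(0.0, 2.0 * np.pi, 200)[:, None]
y = np.sin(X[:, 0])
idx = greedy_inducing_points(X, 12)
pred = sor_predict(X, y, X, idx)
```

Because the selection and the predictor only ever factorize matrices of the inducing-set size m, the cost per insertion stays far below the cubic cost of a full Gaussian process, which is what makes such schemes viable online.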
Year
2016
DOI
10.1016/j.neucom.2016.02.032
Venue
Neurocomputing
Keywords
Gaussian processes, Sparse approximations, Greedy subset selection, Learning robot inverse dynamics
Field
Kriging, Robot control, Data set, Regression, Pattern recognition, Regression analysis, Sparse approximation, Gaussian process, Artificial intelligence, Inverse dynamics, Machine learning, Mathematics
DocType
Journal
Volume
192
ISSN
0925-2312
Citations
3
PageRank
0.48
References
8
Authors
3
Name              Order  Citations  PageRank
Jens Schreiter    1      21         1.83
Duy Nguyen-Tuong  2      438        26.22
Marc Toussaint    3      1299       97.23