Title: Compressive Statistical Learning with Random Feature Moments
Abstract: We describe a general framework, compressive statistical learning, for resource-efficient large-scale learning: the training collection is compressed in one pass into a low-dimensional sketch (a vector of random empirical generalized moments) that captures the information relevant to the considered learning task. A near-minimizer of the risk is computed from the sketch through the solution of a nonlinear least squares problem. We investigate sufficient sketch sizes to control the generalization error of this procedure. The framework is illustrated on compressive clustering, compressive Gaussian mixture modeling with fixed known variance, and compressive PCA.
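The sketch described in the abstract can be illustrated with a minimal, hypothetical example: a vector of random Fourier moments, i.e., empirical averages of cos/sin responses to random frequencies. The function name `compute_sketch`, the Gaussian draw of the frequency matrix `Omega`, and all sizes below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def compute_sketch(X, Omega):
    # Project each sample onto m random frequencies, then average the
    # cos/sin responses over the whole collection: a single pass over
    # the data producing a fixed-size (2m) vector of empirical moments.
    Z = X @ Omega.T  # shape (n, m)
    return np.concatenate([np.cos(Z).mean(axis=0), np.sin(Z).mean(axis=0)])

rng = np.random.default_rng(0)
d, m = 2, 10
Omega = rng.standard_normal((m, d))  # random frequencies (assumed Gaussian here)
X = rng.standard_normal((1000, d))   # toy training collection, n = 1000
sketch = compute_sketch(X, Omega)    # 2m-dimensional, independent of n
```

The point of the construction is that the sketch size depends only on the number of moments m (related to the task), not on the number of training samples n, so the collection can be compressed on the fly.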
Year: 2017
Venue: arXiv: Machine Learning
Field: Dimensionality reduction, Pattern recognition, Mixture modeling, Gaussian, Generalization error, Statistical learning, Artificial intelligence, Non-linear least squares, Cluster analysis, Machine learning, Mathematics, Sketch
DocType:
Volume: abs/1706.07180
Citations: 4
Journal:
PageRank: 0.41
References: 38
Authors: 4
Name              Order  Citations  PageRank
Rémi Gribonval    1      1207       83.59
Gilles Blanchard  2      155        19.47
Nicolas Keriven   3      21         3.74
Yann Traonmilin   4      26         3.22