Title
Data-Dependent Coresets for Compressing Neural Networks with Applications to Generalization Bounds.
Abstract
The deployment of state-of-the-art neural networks containing millions of parameters to resource-constrained platforms may be prohibitive in terms of both time and space. In this work, we present an efficient coreset-based neural network compression algorithm that provably sparsifies the parameters of a trained feedforward neural network in a manner that approximately preserves the network's output. Our approach is based on an importance sampling scheme that judiciously defines a sampling distribution over the neural network parameters, and as a result, retains parameters of high importance while discarding redundant ones. Our method and analysis introduce an empirical notion of sensitivity and extend traditional coreset constructions to the application of compressing parameters. Our theoretical analysis establishes both instance-dependent and instance-independent bounds on the size of the resulting compressed neural network as a function of the user-specified tolerance and failure probability parameters. As a corollary to our practical compression algorithm, we obtain novel generalization bounds that may provide new insights into the generalization properties of neural networks.
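Illustrative sketch (not taken from the paper): the sensitivity-based importance sampling described in the abstract can be sketched for a single neuron in NumPy. The function name sparsify_neuron, the batch-based sensitivity estimate, and the non-negativity assumption on weighted activations are assumptions introduced here for clarity; the paper's actual algorithm and guarantees differ in detail.

    import numpy as np

    def sparsify_neuron(w, A, m, seed=0):
        """Sparsify one neuron's incoming weights via sensitivity-based
        importance sampling.

        w: trained incoming weights of one neuron, shape (d,)
        A: previous-layer activations on a small data batch, shape (n, d)
        m: number of weighted-edge samples to draw (coreset size)

        Assumes non-negative weighted activations so that relative
        contributions are well defined.
        """
        rng = np.random.default_rng(seed)
        contrib = w[None, :] * A                      # each edge's contribution per input
        totals = contrib.sum(axis=1, keepdims=True)   # neuron pre-activation per input
        # Empirical sensitivity: an edge's worst-case relative contribution over the batch
        s = (contrib / np.maximum(totals, 1e-12)).max(axis=0)
        p = s / s.sum()                               # importance-sampling distribution
        idx = rng.choice(w.size, size=m, replace=True, p=p)
        w_hat = np.zeros_like(w)
        np.add.at(w_hat, idx, w[idx] / (m * p[idx])) # inverse-probability reweighting
        return w_hat                                  # sparse, unbiased estimate of w

    # Hypothetical usage: most weights are dropped, yet the neuron's output
    # on the batch is approximately preserved.
    rng = np.random.default_rng(1)
    A = rng.random((64, 1000))
    w = rng.random(1000)
    w_hat = sparsify_neuron(w, A, m=150)
    print(np.count_nonzero(w_hat), "of", w.size, "weights kept;",
          "max relative output error:",
          np.abs(A @ w_hat - A @ w).max() / (A @ w).max())

Edges with high empirical sensitivity are retained with high probability while redundant ones are discarded, and the inverse-probability reweighting keeps the estimate of the neuron's output unbiased, which mirrors the retain-important/discard-redundant behavior the abstract describes.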
Year
2018
Venue
International Conference on Learning Representations
Field
Sampling distribution, Data set, Importance sampling, Data dependent, Algorithm, Artificial intelligence, Artificial neural network, Data compression, Mathematics, Machine learning, Coreset
DocType
Journal
Volume
abs/1804.05345
Citations
5
PageRank
0.40
References
33
Authors
5
Name                Order  Citations  PageRank
Cenk Baykal         1      11         4.93
Lucas Liebenwein    2      5          1.42
Igor Gilitschenski  3      78         13.89
Dan Feldman         4      271        25.25
Daniela Rus         5      7128       657.33