Abstract
Convolutional layers are core building blocks of neural network architectures. In general, a convolutional filter applies to the entire frequency spectrum of the input data. We explore artificially constraining the frequency spectra of these filters and data during training, called band-limiting. The frequency-domain constraints apply to both the feed-forward and back-propagation steps. Experimentally, we observe that Convolutional Neural Networks (CNNs) are resilient to this compression scheme, and the results suggest that CNNs learn to leverage lower-frequency components. In particular, we found that: (1) band-limited training can effectively control resource usage (GPU and memory); (2) models trained with band-limited layers retain high prediction accuracy; and (3) unlike other compression schemes, band-limited training requires no modification to existing training algorithms or neural network architectures.
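The band-limiting the abstract describes can be illustrated with a low-pass mask applied to a filter's 2D spectrum. The sketch below is not from the paper; the function name and the `keep_frac` parameter are illustrative assumptions, and it shows only the core idea of zeroing high-frequency components via the FFT.

```python
import numpy as np

def band_limit(filt, keep_frac=0.5):
    """Illustrative sketch: zero out high-frequency components of a 2D filter.

    filt:      2D array (e.g., a convolutional filter).
    keep_frac: fraction of the spectrum to retain along each axis (assumed
               hyperparameter, not the paper's exact formulation).
    """
    F = np.fft.fftshift(np.fft.fft2(filt))  # low frequencies moved to center
    h, w = F.shape
    kh = max(1, int(h * keep_frac))
    kw = max(1, int(w * keep_frac))
    mask = np.zeros_like(F)
    top, left = (h - kh) // 2, (w - kw) // 2
    mask[top:top + kh, left:left + kw] = 1  # keep a centered low-frequency band
    F_limited = np.fft.ifftshift(F * mask)
    return np.real(np.fft.ifft2(F_limited))
```

With `keep_frac=1.0` the mask keeps the full spectrum and the filter is unchanged; smaller values discard progressively more high-frequency content while preserving the DC (average) component.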
Year | Venue | DocType
---|---|---
2019 | International Conference on Machine Learning | Conference
Citations | PageRank | References
---|---|---
3 | 0.39 | 0
Authors
5
Name | Order | Citations | PageRank |
---|---|---|---
Adam Dziedzic | 1 | 10 | 3.92 |
Ioannis Paparrizos | 2 | 101 | 11.59 |
S. Krishnan | 3 | 391 | 36.25 |
Aaron J. Elmore | 4 | 352 | 34.03 |
Michael J. Franklin | 5 | 17423 | 1681.10 |