Abstract |
---|
The desire to map neural networks to varying-capacity devices has led to the development of a wealth of compression techniques, many of which involve replacing standard convolutional blocks in a large network with cheap alternative blocks. However, not all blocks are created equally; for a required compute budget there may exist a potent combination of many different cheap blocks, though exhaustively searching for such a combination is prohibitively expensive. In this work, we develop BlockSwap: a fast algorithm for choosing networks with interleaved block types by passing a single minibatch of training data through randomly initialised networks and gauging their Fisher potential. These networks can then be used as students and distilled with the original large network as a teacher. We demonstrate the effectiveness of the chosen networks across CIFAR-10 and ImageNet for classification, and COCO for detection, and provide a comprehensive ablation study of our approach. BlockSwap quickly explores possible block configurations using a simple architecture ranking system, yielding highly competitive networks in orders of magnitude less time than most architecture search techniques (e.g. under 5 minutes on a single GPU for CIFAR-10). |
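The core idea — scoring randomly initialised candidate architectures on a single minibatch via a Fisher-style saliency, then keeping the top-ranked one — can be illustrated with a toy sketch. This is not the paper's implementation: the real method scores interleaved convolutional block configurations, whereas here each "candidate" is just a random one-hidden-layer linear model, and `fisher_potential`, `score_candidate`, and the squared-loss gradient are illustrative simplifications.

```python
import numpy as np

rng = np.random.default_rng(0)

def fisher_potential(activations, gradients):
    # Fisher-style proxy: square the per-unit activation*gradient product,
    # average over the minibatch, and sum over units. (The paper aggregates
    # over channels/spatial positions of conv blocks; this is a toy analogue.)
    per_example = activations * gradients          # (batch, units)
    return float(np.mean(per_example ** 2, axis=0).sum())

def score_candidate(width, minibatch, targets):
    # One randomly initialised candidate "network": a single hidden layer of
    # `width` units with squared loss. No training happens -- the score is
    # read off from one forward/backward pass, as in the abstract.
    W1 = rng.normal(scale=0.1, size=(minibatch.shape[1], width))
    W2 = rng.normal(scale=0.1, size=(width, 1))
    hidden = minibatch @ W1                        # hidden activations
    preds = hidden @ W2
    # Gradient of 0.5 * ||preds - targets||^2 w.r.t. the hidden activations.
    grad_hidden = (preds - targets) @ W2.T
    return fisher_potential(hidden, grad_hidden)

X = rng.normal(size=(32, 8))                       # the single minibatch
y = rng.normal(size=(32, 1))
candidates = [4, 16, 64]                           # stand-ins for block configs
scores = {w: score_candidate(w, X, y) for w in candidates}
best = max(scores, key=scores.get)                 # highest-potential candidate
```

Because each candidate needs only one minibatch and no training, ranking many configurations this way is cheap — which is what makes the sub-5-minute search time on CIFAR-10 plausible.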

Year | Venue | Keywords
---|---|---
2020 | ICLR | model compression, architecture search, efficiency, budget, convolutional neural networks

DocType | Citations | PageRank
---|---|---
Conference | 0 | 0.34

References | Authors
---|---
37 | 5

Name | Order | Citations | PageRank
---|---|---|---
Jack Turner | 1 | 1 | 1.70 |
Elliot J. Crowley | 2 | 1 | 1.36 |
Michael O'Boyle | 3 | 405 | 19.81 |
Amos J. Storkey | 4 | 1 | 1.36 |
Gavin Gray | 5 | 11 | 1.94