Title
Characterising Across-Stack Optimisations for Deep Convolutional Neural Networks
Abstract
Convolutional Neural Networks (CNNs) are extremely computationally demanding, presenting a large barrier to their deployment on resource-constrained devices. Since such devices are where some of their most useful applications lie (e.g. obstacle detection for mobile robots, vision-based medical assistive technology), significant bodies of work from both the machine learning and systems communities have attempted to provide optimisations that make CNNs viable on edge devices. In this paper we unify the two viewpoints in a Deep Learning Inference Stack and take an across-stack approach by implementing and evaluating the most common neural network compression techniques (weight pruning, channel pruning, and quantisation) and optimising their parallel execution with a range of programming approaches (OpenMP, OpenCL) and hardware architectures (CPU, GPU). We provide comprehensive Pareto curves to guide trade-offs under constraints of accuracy, execution time, and memory space.
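As a rough illustration of two of the compression techniques named in the abstract (magnitude-based weight pruning and uniform quantisation), here is a minimal NumPy sketch applied to a single weight tensor. The function names, thresholds, and bit width are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch (not the paper's code): magnitude-based weight pruning
# and symmetric uniform 8-bit quantisation of one weight tensor.
import numpy as np

def prune_by_magnitude(weights, sparsity=0.5):
    """Zero out the smallest-magnitude weights so roughly `sparsity` of them become zero."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    return np.where(np.abs(weights) <= threshold, 0.0, weights)

def quantise_uniform(weights, num_bits=8):
    """Symmetric uniform quantisation to signed integers, then dequantise back to floats."""
    qmax = 2 ** (num_bits - 1) - 1
    max_abs = float(np.max(np.abs(weights)))
    scale = max_abs / qmax if max_abs > 0 else 1.0
    q = np.clip(np.round(weights / scale), -qmax, qmax)
    return q * scale  # dequantised values as they would be used at inference

if __name__ == "__main__":
    w = np.random.randn(4, 4).astype(np.float32)
    w_pruned = prune_by_magnitude(w, sparsity=0.5)
    w_quant = quantise_uniform(w_pruned, num_bits=8)
    print("zeros after pruning:", int(np.sum(w_pruned == 0)))
    print("max quantisation error:", float(np.max(np.abs(w_pruned - w_quant))))
```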
Year
2018
DOI
10.1109/IISWC.2018.8573503
Venue
2018 IEEE International Symposium on Workload Characterization (IISWC)
Keywords
across-stack optimisations, deep convolutional neural networks, resource-constrained devices, obstacle detection, mobile robots, vision-based medical assistive technology, machine learning, across-stack approach, weight pruning, channel pruning, CNNs, neural network compression techniques, deep learning inference stack, quantisation, OpenMP, OpenCL, hardware architectures, Pareto curves
DocType
Conference
Volume
abs/1809.07196
ISBN
978-1-5386-6781-1
Citations
7
PageRank
0.55
References
25
Authors
6
Name | Order | Citations | PageRank
Jack Turner | 1 | 9 | 2.26
José Cano | 2 | 12 | 3.32
Valentin Radu | 3 | 60 | 4.69
Elliot Crowley | 4 | 53 | 4.84
Michael O'Boyle | 5 | 405 | 19.81
Amos J. Storkey | 6 | 955 | 94.20