Title
WRPN: Wide Reduced-Precision Networks.
Abstract
For computer vision applications, prior works have shown the efficacy of reducing the numeric precision of model parameters (network weights) in deep neural networks. Activation maps, however, occupy a large memory footprint during both training and inference when using mini-batches of inputs. One way to reduce this large memory footprint is to reduce the precision of activations. However, past works have shown that reducing the precision of activations hurts model accuracy. We study schemes to train networks from scratch using reduced-precision activations without hurting accuracy. We reduce the precision of activation maps (along with model parameters) and increase the number of filter maps in a layer, and find that this scheme matches or surpasses the accuracy of the baseline full-precision network. As a result, one can significantly improve execution efficiency (e.g., reduce dynamic memory footprint, memory bandwidth, and computational energy) and speed up training and inference with appropriate hardware support. We call our scheme WRPN: wide reduced-precision networks. We report results showing that the WRPN scheme achieves better accuracy on the ILSVRC-12 dataset than previously reported reduced-precision networks while being computationally less expensive.
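The abstract describes the core idea only at a high level: quantize activations and weights to a few bits and compensate by widening the layers (more filter maps). Below is a minimal, hypothetical PyTorch sketch of that idea, not the authors' code. The names (QuantizeSTE, WideQuantConv2d), the DoReFa-style weight transform, the straight-through gradient estimator, and the choices k=4 bits and a 2x widening factor are all illustrative assumptions; the exact quantization scheme in the paper may differ.

import torch
import torch.nn as nn
import torch.nn.functional as F


class QuantizeSTE(torch.autograd.Function):
    """Round values in [0, 1] to k-bit levels; pass gradients straight through."""

    @staticmethod
    def forward(ctx, x, k_bits):
        levels = 2 ** k_bits - 1
        return torch.round(x * levels) / levels

    @staticmethod
    def backward(ctx, grad_output):
        # Straight-through estimator: ignore the rounding in the backward pass.
        return grad_output, None


def quantize_activations(x, k_bits=4):
    # Clip activations to [0, 1] before quantizing (assumed, common practice).
    return QuantizeSTE.apply(torch.clamp(x, 0.0, 1.0), k_bits)


def quantize_weights(w, k_bits=4):
    # DoReFa-style transform (an assumption): squash weights to [-1, 1],
    # map to [0, 1], quantize, then map back.
    w = torch.tanh(w) / torch.max(torch.abs(torch.tanh(w)))
    return 2.0 * QuantizeSTE.apply(0.5 * w + 0.5, k_bits) - 1.0


class WideQuantConv2d(nn.Conv2d):
    """Conv layer with k-bit weights/activations and a filter-widening factor."""

    def __init__(self, in_ch, out_ch, kernel_size, widen=2, k_bits=4, **kwargs):
        # Widen the layer by multiplying the number of output filter maps.
        super().__init__(in_ch, out_ch * widen, kernel_size, **kwargs)
        self.k_bits = k_bits

    def forward(self, x):
        x_q = quantize_activations(x, self.k_bits)
        w_q = quantize_weights(self.weight, self.k_bits)
        return F.conv2d(x_q, w_q, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)


if __name__ == "__main__":
    layer = WideQuantConv2d(3, 16, kernel_size=3, widen=2, k_bits=4, padding=1)
    y = layer(torch.rand(1, 3, 32, 32))
    print(y.shape)  # torch.Size([1, 32, 32, 32]): 16 filters widened by 2x

In this sketch, only the forward computation uses the quantized values; the full-precision weights are still updated during training, which is the usual way such quantized networks are trained from scratch.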
Year
2017
Venue
international conference on learning representations
Field
Dynamic random-access memory, Scratch, Memory bandwidth, Computer science, Inference, Footprint, Artificial intelligence, Memory footprint, Computer engineering, Deep neural networks, Machine learning, Speedup
DocType
Journal
Volume
abs/1709.01134
Citations
20
PageRank
0.73
References
15
Authors
4
Name | Order | Citations | PageRank
Asit K. Mishra | 1 | 1216 | 46.21
Eriko Nurvitadhi | 2 | 399 | 33.08
Jeffrey J. Cook | 3 | 110 | 7.45
Debbie Marr | 4 | 175 | 12.39