Title
NN Compactor: Minimizing Memory and Logic Resources for Small Neural Networks
Abstract
Specialized neural accelerators are an appealing hardware platform for machine learning systems because they provide both high performance and energy efficiency. Although various neural accelerators have recently been introduced, they are difficult to adopt on embedded platforms because they require high memory capacity and bandwidth for the fast preparation of synaptic weights, requirements that embedded platforms often cannot meet because of their limited resources. In FPGA-based IoT (Internet of Things) systems, the problem becomes even worse, since computation units generated from logic blocks cannot be fully utilized due to the small size of block memory. To overcome this problem, we propose a novel dual-track quantization technique that reduces synaptic weight width based on the magnitude of each value while minimizing accuracy loss; in this value-adaptive technique, large-value and small-value weights are quantized differently. In this paper, we present a fully automatic framework called NN Compactor that generates a compact neural accelerator by minimizing the memory requirements of synaptic weights through dual-track quantization and by minimizing the logic requirements of processing units (PUs), with minimal recognition-accuracy loss. For the three widely used datasets MNIST, CNAE-9, and Forest, experimental results demonstrate that our compact neural accelerator achieves an average performance improvement of 6.4x over a baseline embedded system while using minimal resources and incurring minimal accuracy loss.
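This record does not specify how the dual-track quantizer is implemented. The sketch below is a minimal, hypothetical illustration of the value-adaptive idea described in the abstract, in which weights below a magnitude threshold are quantized on a fine-grained narrow track and the remaining weights on a coarse full-range track. The function name dual_track_quantize, the threshold, and the bit widths are all illustrative assumptions, not details from the paper.

import numpy as np

def dual_track_quantize(weights, threshold=0.1, small_bits=4, large_bits=8):
    # Hypothetical sketch of value-adaptive dual-track quantization;
    # the paper's actual thresholds, bit widths, and encoding are not
    # given in this record.
    w = np.asarray(weights, dtype=np.float32)
    small_mask = np.abs(w) < threshold

    # Track 1: small-magnitude weights share a fine-grained scale that
    # covers only [-threshold, threshold], so few bits suffice.
    small_scale = threshold / (2 ** (small_bits - 1) - 1)
    q_small = np.clip(np.round(w / small_scale),
                      -(2 ** (small_bits - 1)), 2 ** (small_bits - 1) - 1)

    # Track 2: large-magnitude weights use a coarser scale spanning the
    # full dynamic range of the tensor.
    large_scale = max(float(np.max(np.abs(w))), 1e-12) / (2 ** (large_bits - 1) - 1)
    q_large = np.clip(np.round(w / large_scale),
                      -(2 ** (large_bits - 1)), 2 ** (large_bits - 1) - 1)

    # Dequantize each weight with the scale of its own track.
    return np.where(small_mask, q_small * small_scale, q_large * large_scale)

# Usage example on a random weight matrix.
w = np.random.randn(4, 4).astype(np.float32)
print(dual_track_quantize(w))

Splitting the tracks this way lets the many near-zero weights be stored at low precision without being crushed by a single full-range scale, which is one plausible reading of the memory-reduction goal stated above.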
Year
2018
Venue
Proceedings of the 2018 Design, Automation & Test in Europe Conference & Exhibition (DATE)
Keywords
Neural networks, Accelerator, Automation, Quantization
Field
MNIST database, High memory, Computer science, Field-programmable gate array, Real-time computing, Memory management, Quantization (signal processing), Artificial neural network, Synaptic weight, Computer engineering, Performance improvement
DocType
Conference
ISSN
1530-1591
Citations
0
PageRank
0.34
References
0
Authors
3
Name           Order  Citations  PageRank
Seongmin Hong  1      0          1.35
Inho Lee       2      1          1.72
Yongjun Park   3      277        20.15