Title
A Power-Efficient Accelerator Based on FPGAs for LSTM Network
Abstract
Today, artificial neural networks (ANNs) are widely used in a variety of applications, including speech recognition, face detection, and disease diagnosis. As an emerging class of ANNs, the Long Short-Term Memory (LSTM) network is a recurrent neural network (RNN) that involves complex computational logic. To achieve high accuracy, researchers often build large-scale LSTM networks, which are both time-consuming and power-consuming. In this paper, we present a hardware accelerator for the LSTM neural network layer based on the Zedboard FPGA platform and use pipelining to parallelize the forward computation. We also implement a sparse LSTM hidden layer, which consumes fewer storage resources than a dense network. Our accelerator is power-efficient and runs faster than an ARM Cortex-A9 processor.
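The abstract refers to pipelining the forward computation of an LSTM layer. For orientation only, the sketch below shows a plain software version of one LSTM cell step, i.e. the gate arithmetic such an accelerator would pipeline in hardware; the dimensions, function names, and dense weight layout are illustrative assumptions and are not taken from the paper.

```c
/* Minimal reference sketch of one dense LSTM cell forward step.
 * IN_DIM, HID_DIM, and all names are illustrative assumptions. */
#include <math.h>

#define IN_DIM  32   /* assumed input vector size  */
#define HID_DIM 64   /* assumed hidden state size  */

static float sigmoidf(float x) { return 1.0f / (1.0f + expf(-x)); }

/* y = W*x + U*h + b for one gate; W is HID_DIM x IN_DIM, U is HID_DIM x HID_DIM. */
static void gate_preact(const float W[HID_DIM][IN_DIM],
                        const float U[HID_DIM][HID_DIM],
                        const float b[HID_DIM],
                        const float x[IN_DIM],
                        const float h[HID_DIM],
                        float y[HID_DIM])
{
    for (int r = 0; r < HID_DIM; ++r) {
        float acc = b[r];
        for (int c = 0; c < IN_DIM; ++c)  acc += W[r][c] * x[c];
        for (int c = 0; c < HID_DIM; ++c) acc += U[r][c] * h[c];
        y[r] = acc;
    }
}

/* One forward step: reads input x, updates cell state c and hidden state h. */
void lstm_step(const float Wi[HID_DIM][IN_DIM], const float Ui[HID_DIM][HID_DIM], const float bi[HID_DIM],
               const float Wf[HID_DIM][IN_DIM], const float Uf[HID_DIM][HID_DIM], const float bf[HID_DIM],
               const float Wo[HID_DIM][IN_DIM], const float Uo[HID_DIM][HID_DIM], const float bo[HID_DIM],
               const float Wg[HID_DIM][IN_DIM], const float Ug[HID_DIM][HID_DIM], const float bg[HID_DIM],
               const float x[IN_DIM], float h[HID_DIM], float c[HID_DIM])
{
    float i[HID_DIM], f[HID_DIM], o[HID_DIM], g[HID_DIM];
    gate_preact(Wi, Ui, bi, x, h, i);   /* input gate          */
    gate_preact(Wf, Uf, bf, x, h, f);   /* forget gate         */
    gate_preact(Wo, Uo, bo, x, h, o);   /* output gate         */
    gate_preact(Wg, Ug, bg, x, h, g);   /* candidate cell      */
    for (int r = 0; r < HID_DIM; ++r) {
        c[r] = sigmoidf(f[r]) * c[r] + sigmoidf(i[r]) * tanhf(g[r]);
        h[r] = sigmoidf(o[r]) * tanhf(c[r]);
    }
}
```

In this dense formulation each gate costs two matrix-vector products per timestep; the sparse hidden layer mentioned in the abstract would presumably store and multiply the weight matrices in a compressed form, which is where the reported storage savings come from.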
Year
2017
DOI
10.1109/CLUSTER.2017.45
Venue
2017 IEEE International Conference on Cluster Computing (CLUSTER)
Keywords
power efficient accelerator, FPGA, LSTM network, artificial neural networks, ANN, speech recognition, face detection, disease diagnosis, long short-term memory, recurrent neural network, RNN, complex computational logic, hardware accelerator, LSTM neural network layer, FPGA Zedboard, pipeline methods, forward computing process, ARM Cortex-A9 processor, power-efficient
Field
Computational logic, Computer science, Parallel computing, Field-programmable gate array, Recurrent neural network, Real-time computing, Hardware acceleration, Face detection, Artificial neural network, Energy consumption, Sparse matrix
DocType
Conference
ISSN
1552-5244
ISBN
978-1-5386-2327-5
Citations
2
PageRank
0.37
References
6
Authors
8
Name            Order  Citations  PageRank
Yiwei Zhang     1      52         12.65
Chao Wang       2      372        62.24
Lei Gong        3      65         13.52
Yuntao Lu       4      6          2.59
Fan Sun         5      6          3.27
Chongchong Xu   6      7          4.63
Xi Li           7      202        36.61
Xuehai Zhou     8      551        77.54