Title
Neuron Sparseness Versus Connection Sparseness In Deep Neural Network For Large Vocabulary Speech Recognition
Abstract
Exploiting sparseness in deep neural networks is an important method for reducing computational cost. In this paper, we study neuron sparseness in deep neural networks for acoustic modeling. In the feed-forward stage, we activate only neurons whose input values exceed a given threshold and set the outputs of inactive nodes to zero. Thus, only a few nonzero outputs are fed to the next layer. With this method, the output vector of each hidden layer becomes very sparse, so the computational cost of the feed-forward pass can be reduced by adopting sparse matrix operations. The proposed method is evaluated on both small and large vocabulary speech recognition tasks, and the results demonstrate that the number of nonzero outputs can be reduced to fewer than 20% of the total number of hidden nodes without sacrificing speech recognition performance.
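The abstract describes a two-step idea: threshold the pre-activations so that most hidden outputs are exactly zero, then exploit that sparseness in the next layer's matrix product. Below is a minimal sketch of this idea, assuming a NumPy implementation; the layer sizes, random weights, and threshold value are illustrative assumptions, not the paper's actual configuration.

```python
# Minimal sketch of neuron-sparse feed-forward (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

# Assumed layer shapes: 440-dim input frame, two 2048-unit hidden layers.
W1, b1 = rng.standard_normal((2048, 440)) * 0.01, np.zeros(2048)
W2, b2 = rng.standard_normal((2048, 2048)) * 0.01, np.zeros(2048)

x = rng.standard_normal(440)   # one acoustic feature frame
threshold = 0.02               # assumed activation threshold

# Hidden layer 1: only neurons whose input value exceeds the threshold are
# activated; the outputs of all other nodes are set to zero.
z1 = W1 @ x + b1
h1 = np.where(z1 > threshold, z1, 0.0)

# Hidden layer 2: because h1 is sparse, only the weight columns belonging to
# active neurons need to be read, which is where the cost reduction comes from.
active = np.nonzero(h1)[0]
z2 = W2[:, active] @ h1[active] + b2
h2 = np.where(z2 > threshold, z2, 0.0)

print(f"active neurons in hidden layer 1: {active.size} / {h1.size}")
```

In a production decoder the same effect would come from sparse matrix routines rather than column indexing, but the saving is the same: work proportional to the number of active neurons instead of the full layer width.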
Year
2015
Venue
2015 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP)
Keywords
speech recognition, deep neural network, sparseness, acoustic modeling
Field
Vocabulary speech recognition, Pattern recognition, Computer science, Speech recognition, Time delay neural network, Artificial intelligence, Speech recognition performance, Hidden Markov model, Artificial neural network, Vocabulary, Deep neural networks, Sparse matrix
DocType
Conference
ISSN
1520-6149
Citations
1
PageRank
0.35
References
10
Authors
5
Name               Order  Citations  PageRank
J. Kang            1      3          1.74
Cheng Lu           2      4          1.42
Meng Cai           3      68         8.24
Wei-Qiang Zhang    4      136        31.22
Jia Liu            5      277        50.34