Title
Optimization Techniques to Improve Training Speed of Deep Neural Networks for Large Speech Tasks
Abstract
While Deep Neural Networks (DNNs) have achieved tremendous success for large vocabulary continuous speech recognition (LVCSR) tasks, training these networks is slow. Even to date, the most common approach to train DNNs is via stochastic gradient descent, serially on one machine. Serial training, coupled with the large number of training parameters (i.e., 10–50 million) and speech data set sizes (i.e., 20–100 million training points) makes DNN training very slow for LVCSR tasks. In this work, we explore a variety of different optimization techniques to improve DNN training speed. This includes parallelization of the gradient computation during cross-entropy and sequence training, as well as reducing the number of parameters in the network using a low-rank matrix factorization. Applying the proposed optimization techniques, we show that DNN training can be sped up by a factor of 3 on a 50-hour English Broadcast News (BN) task with no loss in accuracy. Furthermore, using the proposed techniques, we are able to train DNNs on a 300-hour Switchboard (SWB) task and a 400-hour English BN task, showing improvements between 9–30% relative over a state-of-the-art GMM/HMM system while the number of parameters of the DNN is smaller than the GMM/HMM system.
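The abstract's parameter-reduction idea can be illustrated with a minimal sketch: a full weight matrix W (m x n) is replaced by the product of two thin factors A (m x r) and B (r x n) with r much smaller than min(m, n). The layer sizes below are hypothetical examples, not values from the paper.

```python
import numpy as np

# Hedged sketch of low-rank matrix factorization for a DNN layer
# (illustrative sizes only; not the paper's configuration).
m, n, r = 1024, 2220, 128            # output dim, input dim, chosen rank

full_params = m * n                  # parameters in an unfactored W
low_rank_params = m * r + r * n      # parameters after factoring W ~ A @ B

rng = np.random.default_rng(0)
A = rng.standard_normal((m, r))      # m x r factor
B = rng.standard_normal((r, n))      # r x n factor
x = rng.standard_normal(n)           # example input activation

# Forward pass through the factored layer: same output shape as W @ x,
# but with far fewer trainable parameters.
y = A @ (B @ x)
assert y.shape == (m,)
print(full_params, low_rank_params)
```

With these example sizes the factored layer trains roughly 0.4 million parameters instead of about 2.3 million, which is the kind of reduction the abstract exploits to speed up training.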
Year
2013
DOI
10.1109/TASL.2013.2284378
Venue
IEEE Transactions on Audio, Speech & Language Processing
Keywords
speech recognition, entropy, hidden markov models, matrix decomposition, neural nets
Field
Computer science, Artificial intelligence, Artificial neural network, Deep neural networks, Computation, Broadcasting, Stochastic gradient descent, Pattern recognition, Matrix decomposition, Speech recognition, Hidden Markov model, Vocabulary, Machine learning
DocType
Journal
Volume
21
Issue
11
ISSN
1558-7916
Citations
15
PageRank
0.71
References
23
Authors
4
Name                  Order  Citations  PageRank
Tara N. Sainath       1      3497       232.43
B. Kingsbury          2      4175       335.43
Hagen Soltau          3      795        67.33
Bhuvana Ramabhadran   4      1779       153.83