Title
Understanding the dropout strategy and analyzing its effectiveness on LVCSR
Abstract
The work by Hinton et al. shows that the dropout strategy can greatly improve the performance of neural networks and reduce the influence of over-fitting. Nevertheless, a more detailed study of this strategy is still lacking, and the effectiveness of dropout on the task of LVCSR has not been analyzed. In this paper, we discuss the dropout strategy in further depth. The impact of different dropout probabilities on a phone recognition task is evaluated on TIMIT. To gain an in-depth understanding of dropout, experiments on dropout testing are designed from the perspective of model averaging. The effectiveness of dropout is then analyzed on an LVCSR task. Results show that dropout fine-tuning combined with standard back-propagation gives significant performance improvements.
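The technique the abstract centers on can be illustrated briefly. Below is a minimal sketch of dropout in the style of Hinton et al., not the authors' implementation; the function names, NumPy usage, and array shapes are illustrative assumptions. During training each hidden unit is dropped with probability p, and at test time all units are kept with activations scaled by the keep probability 1 - p, which approximates averaging over the exponentially many thinned sub-networks.

import numpy as np

rng = np.random.default_rng(0)  # fixed seed, purely for reproducibility of the sketch

def dropout_train(h, p_drop):
    # Training pass: zero each hidden activation independently with
    # probability p_drop, sampling one "thinned" sub-network per example.
    mask = rng.random(h.shape) >= p_drop
    return h * mask

def dropout_test(h, p_drop):
    # Test pass: keep every unit but scale activations by the keep
    # probability (1 - p_drop); this approximates the averaged prediction
    # of all 2^n thinned sub-networks (the model-averaging view that the
    # paper's dropout-testing experiments examine).
    return h * (1.0 - p_drop)

# Example: a batch of hidden activations with dropout probability 0.5.
h = rng.standard_normal((4, 8))
h_train = dropout_train(h, p_drop=0.5)
h_test = dropout_test(h, p_drop=0.5)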
Year
2013
DOI
10.1109/ICASSP.2013.6639144
Venue
ICASSP
Keywords
deep neural networks, dropout strategy, LVCSR, speech recognition, large vocabulary continuous speech recognition, TIMIT, dropout, standard backpropagation, model averaging, dropout fine-tuning method, over-fitting influence reduction, backpropagation, dropout probabilities, dropout testing, phone recognition task, neural network performance improvement, neural nets, probability, vectors, testing, neural networks, hidden Markov models, accuracy
Field
TIMIT, Computer science, Speech recognition, Phone, Artificial intelligence, Artificial neural network, Backpropagation, Machine learning
DocType
Conference
Volume
null
Issue
null
ISSN
1520-6149
Citations
8
PageRank
0.74
References
7
Authors
3
Name            Order    Citations    PageRank
Jie Li          1        9            1.70
Xiaorui Wang    2        19           6.13
Bo Xu           3        2413         6.59