Abstract |
---|
Using neural networks to estimate the probabilities of word sequences has shown significant promise for statistical language modeling. Typical modeling methods include multi-layer neural networks, log-bilinear networks, and recurrent neural networks. In this paper, we propose the temporal kernel neural network language model, a variant of the models mentioned above. This model explicitly captures long-term dependencies between words with an exponential kernel, under which the memory of the history decays exponentially. Additionally, several sentences of variable length are processed efficiently as a mini-batch to speed up training. Experimental results show that the proposed model is highly competitive with the recurrent neural network language model and achieves a perplexity of 111.6 on the standard Penn Treebank corpus, a reduction of more than 10% over the state-of-the-art results reported there. We further apply this model to a Wall Street Journal speech recognition task and observe significant improvements in word error rate. |
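The exponentially decayed history memory described in the abstract can be sketched with a simple recurrence, h_t = λ·h_{t-1} + x_t, so a word that is k positions old contributes with weight λ^k. This is a minimal illustration only; the decay factor `decay` and the function name are hypothetical, and the paper's exact kernel parameterization may differ.

```python
def temporal_kernel_history(embeddings, decay=0.9):
    """Exponentially decayed memory of the word history.

    h_t = decay * h_{t-1} + x_t, so a word k steps in the past is
    weighted by decay**k. `decay` is a hypothetical kernel parameter,
    not a value taken from the paper.
    """
    dim = len(embeddings[0])
    h = [0.0] * dim
    states = []
    for x in embeddings:
        # blend the decayed history with the current word embedding
        h = [decay * hi + xi for hi, xi in zip(h, x)]
        states.append(h)
    return states

# Three toy 2-dimensional word embeddings:
xs = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
hs = temporal_kernel_history(xs, decay=0.5)
# hs[2] = 0.25*x0 + 0.5*x1 + x2 = [1.25, 1.5]
```

Unlike a recurrent network's learned transition, this fixed-decay recurrence needs no back-propagation through time, which is what makes mini-batching over variable-length sentences straightforward.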
Year | DOI | Venue |
---|---|---|
2013 | 10.1109/ICASSP.2013.6639273 | ICASSP |
Keywords | Field | DocType
---|---|---|
language modeling,speech recognition,multilayer neural networks,exponential kernel,wall street journal,word error rate,temporal kernel neural network,operating system kernels,recurrent neural networks,statistical language modeling,word sequences,penn treebank corpus standard,log-bilinear networks,recurrent neural nets,speech recognition task,kernel,vectors,history | Feedforward neural network,Pattern recognition,Computer science,Recurrent neural network,Speech recognition,Probabilistic neural network,Types of artificial neural networks,Time delay neural network,Artificial intelligence,Deep learning,Artificial neural network,Language model | Conference |
Volume | Issue | ISSN
---|---|---|
null | null | 1520-6149 |
Citations | PageRank | References
---|---|---|
4 | 0.46 | 3 |
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Yongzhe Shi | 1 | 47 | 5.09 |
Wei-Qiang Zhang | 2 | 136 | 31.22 |
Meng Cai | 3 | 68 | 8.24 |
Jia Liu | 4 | 277 | 50.34 |