Title
Improvements To N-Gram Language Model Using Text Generated From Neural Language Model
Abstract
Although neural language models have become widespread, n-gram language models are still used in many speech recognition tasks. This paper proposes four methods to improve n-gram language models using text generated from a recurrent neural network language model (RNNLM). First, we use multiple RNNLMs from different domains instead of a single RNNLM; the final n-gram language model is obtained by interpolating the n-gram models built from the text generated for each domain. Second, we use subwords instead of words for the RNNLM to reduce the out-of-vocabulary rate. Third, we generate text templates with an RNNLM for template-based data augmentation of named entities. Fourth, we use both a forward RNNLM and a backward RNNLM to generate text. We found that these four methods improved speech recognition performance by up to 4% relative across a variety of tasks.
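The first method in the abstract, building per-domain n-gram models from RNNLM-generated text and linearly interpolating them, can be sketched as follows. This is a minimal illustration rather than the authors' implementation: the example domain sentences, the interpolation weights, and the unsmoothed trigram counting are all assumptions made here; a real system would generate large amounts of text with trained RNNLMs and build smoothed models with a toolkit such as SRILM or KenLM, tuning the weights on held-out data.

```python
# Minimal sketch (not the authors' implementation): per-domain trigram models
# built from generated text, combined by linear interpolation.
from collections import Counter

def count_ngrams(sentences, n=3):
    """Count n-grams and their (n-1)-gram histories over tokenized sentences."""
    ngrams, histories = Counter(), Counter()
    for tokens in sentences:
        padded = ["<s>"] * (n - 1) + tokens + ["</s>"]
        for i in range(len(padded) - n + 1):
            gram = tuple(padded[i:i + n])
            ngrams[gram] += 1
            histories[gram[:-1]] += 1
    return ngrams, histories

def ngram_prob(ngrams, histories, gram):
    """Maximum-likelihood n-gram probability (no smoothing, for illustration only)."""
    hist = histories.get(gram[:-1], 0)
    return ngrams.get(gram, 0) / hist if hist else 0.0

def interpolated_prob(models, weights, gram):
    """Linear interpolation of per-domain n-gram models."""
    return sum(w * ngram_prob(ng, hi, gram)
               for w, (ng, hi) in zip(weights, models))

# Hypothetical generated text; in the paper this would come from per-domain RNNLMs.
domain_a = [["call", "the", "office"], ["call", "the", "help", "desk"]]
domain_b = [["play", "the", "song"], ["play", "the", "next", "song"]]

models = [count_ngrams(domain_a), count_ngrams(domain_b)]
weights = [0.6, 0.4]  # assumed interpolation weights, normally tuned on held-out data

print(interpolated_prob(models, weights, ("call", "the", "office")))
```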
Year
2019
DOI
10.1109/icassp.2019.8683481
Venue
2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP)
Keywords
n-gram, RNNLM, interpolation, subword, template
Field
Pattern recognition, Computer science, Recurrent neural network, Speech recognition, Artificial intelligence, n-gram, Template, Language model
DocType
Conference
ISSN
1520-6149
Citations
0
PageRank
0.34
References
0
Authors
5
Name             Order  Citations  PageRank
Masayuki Suzuki  1      23         5.88
Nobuyasu Itoh    2      65         13.19
Tohru Nagano     3      95         9.27
Gakuto Kurata    4      107        19.06
Samuel Thomas    5      536        46.88