Title
Diverse Embedding Neural Network Language Models.
Abstract
We propose Diverse Embedding Neural Network (DENN), a novel architecture for language models (LMs). A DENNLM projects the input word history vector onto multiple diverse low-dimensional sub-spaces instead of a single higher-dimensional sub-space as in conventional feed-forward neural network LMs. We encourage these sub-spaces to be diverse during network training through an augmented loss function. Our language modeling experiments on the Penn Treebank data set show the performance benefit of using a DENNLM.
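The abstract's core idea — projecting the word-history vector onto several diverse low-dimensional sub-spaces rather than one larger sub-space, with a loss term that encourages diversity — can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the dimensions, the random projections, and the squared-cosine diversity penalty are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): history vector of size D,
# K diverse sub-spaces of size d each, replacing one D x (K*d) projection.
D, d, K = 50, 8, 4

# One small projection matrix per sub-space.
projections = [rng.standard_normal((D, d)) * 0.1 for _ in range(K)]

def denn_embed(history):
    """Project the word-history vector onto K low-dimensional sub-spaces."""
    return [history @ W for W in projections]

def diversity_penalty(embeddings):
    """Illustrative diversity term: mean squared cosine similarity between
    sub-space outputs. Adding this to the training loss (and minimizing it)
    pushes the sub-spaces toward diverse, decorrelated representations."""
    total, pairs = 0.0, 0
    for i in range(len(embeddings)):
        for j in range(i + 1, len(embeddings)):
            a, b = embeddings[i], embeddings[j]
            cos = a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
            total += cos ** 2
            pairs += 1
    return total / pairs

history = rng.standard_normal(D)
embs = denn_embed(history)
penalty = diversity_penalty(embs)
print(len(embs), embs[0].shape, penalty)
```

In a full DENN LM, the K sub-space embeddings would feed the rest of the network, and the diversity term would be added to the language-modeling loss during training; the sketch above only shows the projection and the penalty.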
Year
2014
Venue
International Conference on Learning Representations
Field
Architecture, Embedding, Computer science, Neural network language models, Time delay neural network, Natural language processing, Treebank, Artificial intelligence, Artificial neural network, Machine learning, Language model
DocType
Volume
abs/1412.7063
Citations
0
Journal
PageRank
0.34
References
3
Authors
3
Name                 Order  Citations  PageRank
Kartik Audhkhasi     1      189        23.25
Abhinav Sethy        2      363        31.16
Bhuvana Ramabhadran  3      1779       153.83