Title: Self-organized Hierarchical Softmax
Abstract: We propose a new self-organizing hierarchical softmax formulation for neural-network-based language models over large vocabularies. Instead of using a predefined hierarchical structure, our approach learns word clusters with clear syntactic and semantic meaning during language model training. We provide experiments on standard benchmarks for language modeling and sentence compression tasks. We find that this approach is as fast as other efficient softmax approximations, while achieving comparable or even better performance than similar full-softmax models.
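The abstract describes the method only at a high level. For context, the sketch below shows a generic two-level (class-based) hierarchical softmax in PyTorch, which factorizes log p(w | h) = log p(c(w) | h) + log p(w | c(w), h) so that only one cluster's words are scored per token, costing O(num_clusters + cluster_size) instead of O(vocab_size). It assumes a fixed, equal-size word-to-cluster assignment (word id = cluster id * cluster_size + offset); per the abstract, the paper's contribution is to learn the clustering during training rather than fix it in advance. All names here (TwoLevelSoftmax, log_prob, the layer sizes) are illustrative, not taken from the paper.

    import torch
    import torch.nn as nn

    class TwoLevelSoftmax(nn.Module):
        """Generic two-level hierarchical softmax with equal-size clusters.

        Assumes words are arranged so that
        word id = cluster id * cluster_size + within-cluster offset.
        (A fixed assignment; the paper instead learns the clustering.)
        """

        def __init__(self, hidden_dim, num_clusters, cluster_size):
            super().__init__()
            self.cluster_size = cluster_size
            # First level: a softmax over clusters.
            self.cluster_proj = nn.Linear(hidden_dim, num_clusters)
            # Second level: per-cluster output embeddings and biases.
            self.word_emb = nn.Parameter(
                0.02 * torch.randn(num_clusters, cluster_size, hidden_dim))
            self.word_bias = nn.Parameter(torch.zeros(num_clusters, cluster_size))

        def log_prob(self, h, target):
            """log p(w | h) = log p(c(w) | h) + log p(w | c(w), h).

            h: (batch, hidden_dim) hidden states; target: (batch,) word ids.
            """
            c = target // self.cluster_size      # target cluster ids
            j = target % self.cluster_size       # within-cluster offsets
            log_p_c = torch.log_softmax(self.cluster_proj(h), dim=-1)
            log_p_c = log_p_c.gather(1, c.unsqueeze(1)).squeeze(1)
            # Score only the cluster_size words in each target's cluster,
            # not the full vocabulary: this is the source of the speedup.
            w = self.word_emb[c]                 # (batch, cluster_size, hidden_dim)
            b = self.word_bias[c]                # (batch, cluster_size)
            word_logits = torch.einsum("bd,bkd->bk", h, w) + b
            log_p_w = torch.log_softmax(word_logits, dim=-1)
            log_p_w = log_p_w.gather(1, j.unsqueeze(1)).squeeze(1)
            return log_p_c + log_p_w

    # Usage: a 10,000-word vocabulary split into 100 clusters of 100 words.
    hsm = TwoLevelSoftmax(hidden_dim=64, num_clusters=100, cluster_size=100)
    h = torch.randn(8, 64)                       # batch of hidden states
    target = torch.randint(0, 10000, (8,))       # next-word ids
    loss = -hsm.log_prob(h, target).mean()       # NLL training loss

With balanced clusters (cluster_size near the square root of the vocabulary size), each token costs roughly O(sqrt(V)) output computations rather than O(V), which is why such approximations match the speed claims made in the abstract.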
Year: 2017
Venue: CoRR
DocType: Journal
Volume: abs/1707.08588
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name                      Order  Citations  PageRank
Yikang Shen               1      0          2.03
Shawn Tan                 2      0          2.37
Christopher Joseph Pal    3      0          0.34
Aaron C. Courville        4      6671       348.46