Title
Learning Compact Neural Word Embeddings by Parameter Space Sharing.
Abstract
The word embedding vectors obtained from neural word embedding methods, such as vLBL models and SkipGram, have become an important fundamental resource for tackling a wide variety of tasks in the artificial intelligence field. This paper focuses on the fact that the model size of high-quality embedding vectors is relatively large, i.e., more than 1GB. We propose a learning framework that can provide a set of 'compact' embedding vectors for the purpose of enhancing 'usability' in actual applications. Our proposed method incorporates parameter sharing constraints into the optimization problem. These additional constraints force the embedding vectors to share parameter values, which significantly shrinks model size. We investigate the trade-off between quality and model size of embedding vectors for several linguistic benchmark datasets, and show that our method can significantly reduce the model size while maintaining the task performance of conventional methods.
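As an illustration of the idea described in the abstract (not the authors' actual formulation), the sketch below shows one minimal way parameter sharing can shrink an embedding matrix: every entry is restricted to a small set of shared values, so the model can be stored as small integer assignments plus a tiny codebook. All names (`codebook`, `assign`, `num_shared_values`) and the quantile-based snapping are assumptions made for this example; in the paper, the sharing constraints are part of the training objective rather than a post-hoc quantization step.

```python
# Minimal sketch of value sharing for embedding compression (illustrative only;
# not the method of Suzuki & Nagata). Each parameter is assigned to one of a
# few shared values, so storage is integer assignments + a small codebook.
import numpy as np

rng = np.random.default_rng(0)

vocab_size, dim = 10_000, 100       # hypothetical vocabulary size and dimensionality
num_shared_values = 16              # hypothetical size of the shared value set

# Dense embeddings as a conventional method (e.g., SkipGram) might produce them.
dense = rng.standard_normal((vocab_size, dim)).astype(np.float32)

# Shared values chosen here by simple quantiles of the dense weights; in a
# learning framework they would be optimized jointly with the assignments.
codebook = np.quantile(dense, np.linspace(0.0, 1.0, num_shared_values)).astype(np.float32)

# Force parameter sharing: snap every entry to its nearest shared value.
assign = np.abs(dense[..., None] - codebook).argmin(axis=-1).astype(np.uint8)
compact = codebook[assign]          # embeddings reconstructed from shared values only

dense_bytes = dense.nbytes                       # 4 bytes per parameter
compact_bytes = assign.nbytes + codebook.nbytes  # 1 byte per assignment + tiny codebook
print(f"dense model:   {dense_bytes / 2**20:.2f} MiB")
print(f"compact model: {compact_bytes / 2**20:.2f} MiB")
print(f"mean abs reconstruction error: {np.abs(dense - compact).mean():.4f}")
```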
Year
2016
Venue
IJCAI
Field
Embedding, Computer science, Usability, Theoretical computer science, Parameter space, Artificial intelligence, Word embedding, Optimization problem, Machine learning
DocType
Conference
Citations
2
PageRank
0.37
References
16
Authors
2
Name | Order | Citations | PageRank
Junichi Suzuki | 1 | 1265 | 112.15
Masaaki Nagata | 2 | 19 | 5.41