Title
Incremental Skip-gram Model with Negative Sampling
Abstract
This paper explores an incremental training strategy for the skip-gram model with negative sampling (SGNS) from both empirical and theoretical perspectives. Existing neural word embedding methods, including SGNS, are multi-pass algorithms and thus cannot perform incremental model updates. To address this problem, we present a simple incremental extension of SGNS and provide a thorough theoretical analysis to demonstrate its validity. Experiments demonstrate the correctness of the theoretical analysis as well as the practical usefulness of the incremental algorithm.
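To make the idea of single-pass training concrete, below is a minimal Python sketch of what incremental SGNS can look like: the vocabulary, the unigram noise distribution, and the embedding vectors are all updated on the fly as tokens stream in. This is an illustration under simplifying assumptions, not the authors' exact algorithm; the streaming vocabulary handling, the fixed learning rate, and all names and hyperparameters (DIM, WINDOW, NEG, LR, POWER) are illustrative choices.

```python
# Minimal sketch of single-pass (incremental) SGNS training.
# NOT the paper's exact algorithm: vocabulary handling, learning-rate
# schedule, and the noise-distribution update are simplified assumptions.
import numpy as np
from collections import defaultdict

DIM, WINDOW, NEG, LR, POWER = 100, 5, 5, 0.025, 0.75

counts = defaultdict(int)   # unigram counts, maintained incrementally
vec_in = {}                 # target-word vectors, grown as new words arrive
vec_out = {}                # context-word vectors

def vector(table, word):
    """Lazily create a small random vector the first time a word is seen."""
    if word not in table:
        table[word] = (np.random.rand(DIM) - 0.5) / DIM
    return table[word]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-np.clip(x, -30.0, 30.0)))

def sample_negative():
    # Smoothed unigram noise distribution over the words seen so far.
    # Recomputing it per draw is O(V); a real implementation would cache it.
    words = list(counts)
    p = np.array([counts[w] for w in words], dtype=float) ** POWER
    return words[np.random.choice(len(words), p=p / p.sum())]

def train_pair(target, context, label):
    """One SGD step on the negative-sampling objective for a word pair."""
    t, c = vector(vec_in, target), vector(vec_out, context)
    g = LR * (label - sigmoid(t @ c))
    t_old = t.copy()
    t += g * c              # in-place update of the stored arrays
    c += g * t_old

def process_stream(tokens):
    """Single pass over a token stream: update counts, then vectors."""
    recent = []
    for word in tokens:
        counts[word] += 1
        for ctx in recent:
            train_pair(word, ctx, 1.0)                    # positive pair
            for _ in range(NEG):
                train_pair(word, sample_negative(), 0.0)  # negative pairs
        recent.append(word)
        del recent[:-WINDOW]  # keep only the last WINDOW tokens as context
```

The key contrast with standard multi-pass SGNS is that nothing here requires knowing the vocabulary or word frequencies in advance: `process_stream` can be called repeatedly on new data and the model simply continues updating.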
Year
2017
DOI
10.18653/v1/d17-1037
Venue
EMNLP
DocType
Conference
Volume
abs/1704.03956
Citations
5
PageRank
0.43
References
12
Authors
2
Name              Order  Citations  PageRank
Nobuhiro Kaji     1      257        21.71
Hayato Kobayashi  2      29         6.07