Title
Riemannian Optimization for Skip-Gram Negative Sampling
Abstract
The Skip-Gram Negative Sampling (SGNS) word embedding model, well known through its implementation in the "word2vec" software, is usually optimized by stochastic gradient descent. However, optimization of the SGNS objective can be viewed as the problem of searching for a good matrix under a low-rank constraint. A standard way to solve this type of problem is to apply the Riemannian optimization framework, optimizing the SGNS objective over the manifold of matrices of the required low rank. In this paper, we propose an algorithm that optimizes the SGNS objective using Riemannian optimization and demonstrate its superiority over popular competitors, such as the original method for training SGNS and SVD over the SPPMI matrix.
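For illustration, the following is a minimal NumPy sketch of the general idea the abstract describes: maximize the SGNS objective by taking Euclidean gradient steps on the dense matrix X = WC^T and retracting back onto the manifold of rank-d matrices. The function names, variable names (cooc, word_counts, etc.), and the simple truncated-SVD retraction are illustrative assumptions, not the paper's exact algorithm, which may use a different retraction and step rule.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_gradient(X, cooc, word_counts, ctx_counts, k, total):
    """Euclidean gradient of the SGNS objective w.r.t. X = W C^T.

    cooc[i, j]   -- co-occurrence counts #(w, c) from the corpus (assumed input)
    word_counts  -- marginal counts #(w); ctx_counts -- marginal counts #(c)
    k            -- number of negative samples; total -- corpus size |D|
    """
    neg = k * np.outer(word_counts, ctx_counts) / total
    # d/dx [ #(w,c) log s(x) + neg * log s(-x) ] = #(w,c) s(-x) - neg * s(x)
    return cooc * sigmoid(-X) - neg * sigmoid(X)

def retract(X, d):
    """Retract X onto the manifold of rank-d matrices via truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :d] * s[:d]) @ Vt[:d]

def riemannian_ascent(cooc, word_counts, ctx_counts, k, total, d,
                      lr=1e-4, iters=50, seed=0):
    """Gradient ascent on the SGNS objective, constrained to rank d."""
    X = retract(np.random.default_rng(seed).normal(size=cooc.shape), d)
    for _ in range(iters):
        grad = sgns_gradient(X, cooc, word_counts, ctx_counts, k, total)
        X = retract(X + lr * grad, d)  # step in ambient space, then retract
    return X

Truncated SVD is one standard retraction onto the fixed-rank manifold; after optimization, the factors W and C can be recovered from a decomposition of the resulting X.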
Year
2017
DOI
10.18653/v1/P17-1185
Venue
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (ACL 2017), Vol. 1
DocType
Conference
Volume
abs/1704.08059
Citations
1
PageRank
0.38
References
14
Authors
5
Name               Order  Citations  PageRank
Alexander Fonarev  1      1          1.06
Oleksii Hrinchuk   2      6          2.86
Gleb Gusev         3      190        16.53
Pavel Serdyukov    4      1341       90.10
Ivan V. Oseledets  5      306        41.96