Title
GEPC: Global Embeddings with PID Control
Abstract
Global vectors, or global embeddings, are important word representations for many natural language processing tasks. With the rise of dynamic embeddings (also known as contextual embeddings, such as ELMo and BERT) in recent years, attention has largely shifted away from global vectors. Yet, compared with dynamic embeddings, global embeddings are faster to train, more straightforward to interpret, and can be evaluated on many standard and credible intrinsic benchmarks (e.g., word similarity correlation and analogy accuracy). They therefore remain widely used in numerous downstream applications. However, the model design of global embeddings has certain limitations that make the learned word representations suboptimal. In this paper, we propose a novel method that addresses these limitations using PID control. To the best of our knowledge, this is one of the first efforts to leverage PID control in word embedding research. Empirical results on standard intrinsic and extrinsic benchmarks show a consistent performance boost for the proposed method, suggesting that it is a promising alternative for learning better word representations for downstream tasks. © 2021 Elsevier Ltd. All rights reserved.
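For readers unfamiliar with the control-theoretic tool the abstract invokes, the following is a minimal, generic sketch of a discrete PID controller. It illustrates only the classic control law (proportional, integral, and derivative terms on an error signal); the paper's specific way of coupling PID control to global-embedding training is not described in this record, so the class name, gains, and error signal below are illustrative assumptions, not the authors' method.

```python
class PIDController:
    """Generic discrete PID controller (illustrative sketch)."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd  # proportional, integral, derivative gains
        self.integral = 0.0       # running sum of past errors (I term)
        self.prev_error = None    # previous error, used for the derivative (D term)

    def update(self, error, dt=1.0):
        """Return u(t) = Kp*e(t) + Ki*sum(e)*dt + Kd*de/dt for one time step."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Example usage: drive a scalar value toward a target setpoint.
pid = PIDController(kp=0.5, ki=0.1, kd=0.05)
value, target = 0.0, 1.0
for _ in range(50):
    value += pid.update(target - value)  # apply the control signal as a correction
```

In an embedding-training context, one could imagine the "error" being some training signal and the control output modulating an update; how the paper instantiates this is not specified in the abstract.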
Year: 2021
DOI: 10.1016/j.csl.2021.101197
Venue: COMPUTER SPEECH AND LANGUAGE
Keywords: Natural language processing, Representation learning, Word embedding, Global vectors
DocType: Journal
Volume: 68
ISSN: 0885-2308
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name          Order  Citations  PageRank
Ning Gong     1      0          0.68
Nianmin Yao   2      159        21.57
Ziying Lv     3      0          0.34
Shibin Wang   4      0          0.68