Title: Revisiting Representation Degeneration Problem in Language Modeling
Abstract: Weight tying is now a common setting in many language generation tasks such as language modeling and machine translation. However, a recent study reveals a potential flaw in weight tying: the learned word embeddings tend to degenerate and lie in a narrow cone when training a language model. The authors of that study call this the representation degeneration problem and propose a cosine regularization to solve it. Nevertheless, we prove that cosine regularization is insufficient to solve the problem, as the degeneration can still occur under certain conditions. In this paper, we revisit the representation degeneration problem and theoretically analyze the limitations of the previously proposed solution. We then propose an alternative regularization method, Laplacian regularization, to tackle the problem. Experiments on language modeling demonstrate the effectiveness of the proposed Laplacian regularization.
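For readers unfamiliar with the two regularizers named in the abstract, the sketch below illustrates them in PyTorch. The cosine term follows the commonly cited formulation from the earlier study (mean pairwise cosine similarity between distinct embeddings); the Laplacian term is only a generic graph-Laplacian penalty tr(W^T L W), since the abstract does not give the paper's formula and the actual objective may differ (e.g., in sign or graph construction). The adjacency matrix `adj`, the weight 0.1, and both function names are illustrative assumptions, not the paper's notation.

```python
import torch

def cosine_regularization(emb: torch.Tensor) -> torch.Tensor:
    # Mean pairwise cosine similarity over distinct embedding pairs;
    # adding gamma * this term to the LM loss pushes embeddings apart.
    w = torch.nn.functional.normalize(emb, dim=1)  # unit-norm rows
    sim = w @ w.t()                                # |V| x |V| cosine matrix
    n = emb.size(0)
    off_diag = sim.sum() - sim.diagonal().sum()    # exclude the i == j terms
    return off_diag / (n * (n - 1))

def laplacian_regularization(emb: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
    # Generic graph-Laplacian penalty tr(W^T L W) with L = D - A, which
    # equals 1/2 * sum_ij A_ij * ||w_i - w_j||^2. `adj` is a hypothetical
    # word-affinity graph; the paper's exact formulation may differ.
    lap = torch.diag(adj.sum(dim=1)) - adj
    return torch.trace(emb.t() @ lap @ emb)

# Usage sketch: regularize a toy embedding table.
vocab, dim = 100, 32
emb = torch.randn(vocab, dim, requires_grad=True)
adj = torch.rand(vocab, vocab)
adj = (adj + adj.t()) / 2                          # symmetrize the graph
loss = cosine_regularization(emb) + 0.1 * laplacian_regularization(emb, adj)
loss.backward()
```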
Year: 2020
DOI: 10.18653/V1/2020.FINDINGS-EMNLP.46
Venue: EMNLP
DocType: Conference
Volume: 2020.findings-emnlp
Citations: 0
PageRank: 0.34
References: 0
Authors: 6
Name            Order  Citations  PageRank
Zhong Zhang     1      141        32.42
Chongming Gao   2      11         2.62
Cong Xu         3      0          1.01
Rui Miao        4      0          1.69
Qinli Yang      5      137        17.12
Junming Shao    6      2          2.41