Title
LMSOC: An Approach for Socially Sensitive Pretraining
Abstract
While large-scale pretrained language models have been shown to learn effective linguistic representations for many NLP tasks, there remain many real-world contextual aspects of language that current approaches do not capture. For instance, consider the cloze test "I enjoyed the ____ game this weekend": the correct answer depends heavily on where the speaker is from, when the utterance occurred, and the speaker's broader social milieu and preferences. Although language depends on the geographical, temporal, and other social contexts of the speaker, these elements have not been incorporated into modern transformer-based language models. We propose a simple but effective approach to incorporate speaker social context into the learned representations of large-scale language models. Our method first learns dense representations of social contexts using graph representation learning algorithms and then primes language model pretraining with these social context representations. We evaluate our approach on geographically-sensitive language-modeling tasks and show a substantial improvement (more than 100% relative lift on MRR) compared to baselines.
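The abstract describes a two-stage recipe: learn dense social-context embeddings with a graph representation learning method, then "prime" masked language model pretraining with them. The following is a minimal illustrative sketch, not the authors' released code: it assumes a social-context vector has already been learned (e.g., with a node2vec-style method over a user or location graph) and simply projects it and prepends it to the token embeddings of a BERT masked language model, so the cloze prediction can condition on the speaker's context. The names SOCIAL_DIM, social_proj, and mlm_loss_with_social_context are hypothetical.

import torch
import torch.nn as nn
from transformers import BertTokenizerFast, BertForMaskedLM

# Sketch only, under the assumptions stated above; not the paper's implementation.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
mlm = BertForMaskedLM.from_pretrained("bert-base-uncased")

SOCIAL_DIM = 64                      # assumed size of the pretrained graph embeddings
HIDDEN = mlm.config.hidden_size

# Project a precomputed social-context embedding into the LM's hidden space.
social_proj = nn.Linear(SOCIAL_DIM, HIDDEN)

def mlm_loss_with_social_context(text: str, social_vec: torch.Tensor) -> torch.Tensor:
    enc = tokenizer(text, return_tensors="pt")
    input_ids = enc["input_ids"]

    # Predict only the [MASK] position; -100 is ignored by the MLM loss.
    labels = input_ids.clone()
    labels[input_ids != tokenizer.mask_token_id] = -100

    # Token embeddings from the LM's own embedding table.
    tok_emb = mlm.bert.embeddings.word_embeddings(input_ids)

    # Prepend one "social context" position before the token sequence.
    ctx = social_proj(social_vec).unsqueeze(0).unsqueeze(0)        # (1, 1, HIDDEN)
    inputs_embeds = torch.cat([ctx, tok_emb], dim=1)
    labels = torch.cat([torch.full((1, 1), -100, dtype=labels.dtype), labels], dim=1)
    attention_mask = torch.cat(
        [torch.ones((1, 1), dtype=enc["attention_mask"].dtype), enc["attention_mask"]],
        dim=1,
    )

    out = mlm(inputs_embeds=inputs_embeds, attention_mask=attention_mask, labels=labels)
    return out.loss

# Example with the abstract's cloze sentence and a random stand-in for a learned
# graph embedding of the speaker's social context (e.g., their location node).
loss = mlm_loss_with_social_context(
    "I enjoyed the [MASK] game this weekend.", torch.randn(SOCIAL_DIM)
)

In actual pretraining one would mask tokens at random rather than supplying a single [MASK], and the social-context vectors would come from the graph embedding stage rather than torch.randn; the sketch only shows where the context representation enters the model.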
Year
2021
Venue
EMNLP
DocType
Conference
Volume
2021.findings-emnlp
Citations
0
PageRank
0.34
References
0
Authors
3
Name                Order   Citations   PageRank
Vivek Kulkarni      1       0           0.34
Shubhanshu Mishra   2       0           0.34
Aria Haghighi       3       0           0.34