Title
Exploring Fine-Tuned Embeddings that Model Intensifiers for Emotion Analysis
Abstract
Adjective phrases like a bit surprised, completely shocked, or not stunned at all are not handled properly by currently published state-of-the-art emotion classification and intensity prediction systems, which predominantly use non-contextualized word embeddings as input. Based on this finding, we analyze differences between the embeddings used by these systems with regard to their capability of handling such phrases. Furthermore, we argue that intensifiers in the context of emotion words need special treatment, as is established for sentiment polarity classification but not for more fine-grained emotion prediction. To resolve this issue, we analyze different aspects of a post-processing pipeline which enriches the word representations of such phrases. This includes expansion of semantic spaces at the phrase level and the subword level, followed by retrofitting to emotion lexica. We evaluate the impact of these steps with A La Carte and Bag-of-Substrings extensions based on pretrained GloVe, Word2vec, and fastText embeddings against a crowdsourced corpus of intensity annotations for tweets containing our focus phrases. We show that the fastText-based models do not gain from handling the specific phrases under inspection. For Word2vec embeddings, we show that our post-processing pipeline improves the results by up to 8% on a novel dataset densely populated with intensifiers.
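The retrofitting step mentioned in the abstract can be illustrated with a minimal sketch in the style of graph-based retrofitting (Faruqui et al.): each word vector is iteratively pulled toward the vectors of words it is linked to in a lexicon. This is an assumed, simplified reading of the pipeline; the paper's actual lexica, weighting, and hyperparameters may differ.

```python
# Minimal retrofitting sketch (Faruqui et al.-style), shown for illustration
# only; alpha/beta weights and the toy lexicon below are assumptions.
import numpy as np

def retrofit(vectors, lexicon, iterations=10, alpha=1.0, beta=1.0):
    """Nudge each word vector toward its lexicon neighbours.

    vectors: dict word -> np.ndarray (pretrained embeddings)
    lexicon: dict word -> list of related words (e.g. same emotion class)
    """
    new = {w: v.copy() for w, v in vectors.items()}
    for _ in range(iterations):
        for word, neighbours in lexicon.items():
            nbrs = [n for n in neighbours if n in new]
            if word not in new or not nbrs:
                continue
            # Weighted average of the original pretrained vector and the
            # current vectors of the lexicon neighbours.
            num = alpha * vectors[word] + beta * sum(new[n] for n in nbrs)
            new[word] = num / (alpha + beta * len(nbrs))
    return new

# Toy example: pull "shocked" toward its lexicon neighbour "surprised".
vecs = {"shocked": np.array([1.0, 0.0]), "surprised": np.array([0.0, 1.0])}
lex = {"shocked": ["surprised"]}
out = retrofit(vecs, lex)
```

With equal weights, the retrofitted vector converges to the midpoint between the original vector and its single neighbour, while words absent from the lexicon keep their pretrained vectors.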
Year
2019
DOI
10.18653/v1/w19-1304
Venue
North American Chapter of the Association for Computational Linguistics
Field
Computer science, Emotion classification, Phrase, Natural language processing, Artificial intelligence, Word2vec, Adjective
DocType
Journal
Volume
abs/1904.03164
Citations
0
PageRank
0.34
References
0
Authors
2
Name | Order | Citations | PageRank
Laura Ana Maria Bostan | 1 | 2 | 0.72
Roman Klinger | 2 | 201 | 29.85