Title
Attention uncovers task-relevant semantics in emotional narrative understanding
Abstract
Attention mechanisms have helped deep neural network models achieve exceptional performance on complex natural language processing tasks. Previous attempts to investigate what these models have been "paying attention to" suggest that attention representations capture syntactic information, but there is less evidence that they capture semantics. In this paper, we investigate the capability of an attention mechanism to "attend to" semantically meaningful words. Using a dataset of naturalistic emotional narratives, we first build a Window-Based Attention (WBA) model consisting of a hierarchical, two-level long short-term memory (LSTM) network with softmax attention. Our model outperforms state-of-the-art models at predicting emotional valence, even surpassing average human performance. Next, we show through detailed analyses, including word-deletion experiments and visualizations, that words receiving higher attention weights in our model also tend to carry greater emotional semantic meaning. Experimental results using six different pre-trained word embeddings suggest that deep neural network models that achieve human-level performance may learn to place greater attention weights on words that humans find semantically meaningful for the task at hand.
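To make the described architecture concrete, below is a minimal sketch of a hierarchical two-level LSTM with softmax attention, written in PyTorch. This is not the authors' implementation: the module names (SoftmaxAttention, HierarchicalAttentionLSTM), the dimensions (word_dim=300, hidden_dim=128), and the single-layer unidirectional LSTMs are all assumptions made for illustration. Only the overall pattern, a word-level LSTM pooled by softmax attention within each window and a second LSTM over window representations predicting valence, follows the abstract.

# Minimal sketch (not the authors' code) of a hierarchical two-level LSTM
# with softmax attention, assuming PyTorch. All names and dimensions here
# are hypothetical.
import torch
import torch.nn as nn

class SoftmaxAttention(nn.Module):
    """Softmax attention pooling over a sequence of hidden states."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, h):  # h: (batch, seq_len, hidden_dim)
        # One scalar score per time step, normalized into attention weights.
        weights = torch.softmax(self.score(h).squeeze(-1), dim=-1)  # (batch, seq_len)
        # Weighted sum of hidden states -> one context vector per sequence.
        context = torch.bmm(weights.unsqueeze(1), h).squeeze(1)     # (batch, hidden_dim)
        return context, weights

class HierarchicalAttentionLSTM(nn.Module):
    """Word-level LSTM + attention inside each window; window-level LSTM on top."""
    def __init__(self, word_dim=300, hidden_dim=128):
        super().__init__()
        self.word_lstm = nn.LSTM(word_dim, hidden_dim, batch_first=True)
        self.attention = SoftmaxAttention(hidden_dim)
        self.window_lstm = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
        self.valence_head = nn.Linear(hidden_dim, 1)  # scalar valence per window

    def forward(self, windows):  # windows: (batch, n_windows, n_words, word_dim)
        b, n_win, n_words, d = windows.shape
        # Level 1: run the word LSTM over every window independently.
        h, _ = self.word_lstm(windows.view(b * n_win, n_words, d))
        # Pool each window's word states into one vector via softmax attention.
        context, attn = self.attention(h)
        # Level 2: run a second LSTM over the sequence of window vectors.
        h_win, _ = self.window_lstm(context.view(b, n_win, -1))
        valence = self.valence_head(h_win).squeeze(-1)  # (batch, n_windows)
        return valence, attn.view(b, n_win, n_words)

Because the forward pass returns the per-word attention weights alongside the valence predictions, a word-deletion analysis of the kind described in the abstract could be run by zeroing out the highest-weighted words in each window and measuring how much the valence predictions degrade.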
Year
2021
DOI
10.1016/j.knosys.2021.107162
Venue
Knowledge-Based Systems
Keywords
Explainable AI, Emotion understanding, Neural network attention
DocType
Journal
Volume
226
ISSN
0950-7051
Citations
0
PageRank
0.34
References
0
Authors
3
Name               Order  Citations  PageRank
Thanh Son Nguyen   1      4          3.14
Zhengxuan Wu       2      1          1.03
Desmond Ong        3      10         5.23