Abstract |
---|
Aspect-based sentiment analysis (ABSA) aims to predict the fine-grained sentiment of a comment with respect to given aspect terms or categories. Previous ABSA methods have recognized and verified the importance of aspect information. Most existing LSTM-based models take the aspect into account via the attention mechanism, where attention weights are calculated after the context has been modeled into contextual vectors. However, during context modeling, classic LSTM cells may already discard aspect-related information and retain aspect-irrelevant information, leaving room for more effective context representations. This paper proposes a novel variant of LSTM, termed aspect-aware LSTM (AA-LSTM), which incorporates aspect information into the LSTM cells during the context modeling stage, before the attention mechanism. As a result, AA-LSTM can dynamically produce aspect-aware contextual representations. We experiment with several representative LSTM-based models by replacing their classic LSTM cells with AA-LSTM cells. Experimental results on the SemEval-2014 datasets demonstrate the effectiveness of AA-LSTM. |
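The core idea described in the abstract — injecting aspect information into the recurrent cell itself rather than only at the attention stage — can be sketched as a toy pure-Python LSTM cell whose gates are conditioned on a fixed aspect vector. This is a simplified illustration under assumed gating (the aspect vector is simply concatenated into every gate's input); the paper's actual AA-LSTM equations are not given in this page, and all names below are hypothetical.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class AspectAwareLSTMCell:
    """Toy LSTM cell whose gates also see a fixed aspect vector.

    Simplified sketch: the aspect embedding is concatenated with the
    token input and previous hidden state before every gate. The real
    AA-LSTM formulation may differ (e.g., dedicated aspect gates).
    """

    def __init__(self, input_size, hidden_size, aspect_size, seed=0):
        rng = random.Random(seed)
        n_in = input_size + hidden_size + aspect_size
        # One weight matrix and bias per gate:
        # i = input gate, f = forget gate, o = output gate, g = candidate.
        self.W = {g: [[rng.uniform(-0.1, 0.1) for _ in range(n_in)]
                      for _ in range(hidden_size)] for g in "ifog"}
        self.b = {g: [0.0] * hidden_size for g in "ifog"}
        self.hidden_size = hidden_size

    def step(self, x, h, c, aspect):
        # Concatenate token input, previous hidden state, and aspect vector,
        # so every gate is conditioned on the aspect during context modeling.
        z = x + h + aspect

        def lin(gate):
            return [sum(w * v for w, v in zip(row, z)) + bias
                    for row, bias in zip(self.W[gate], self.b[gate])]

        i = [sigmoid(v) for v in lin("i")]
        f = [sigmoid(v) for v in lin("f")]
        o = [sigmoid(v) for v in lin("o")]
        g = [math.tanh(v) for v in lin("g")]
        # Standard LSTM state update: c' = f*c + i*g, h' = o * tanh(c').
        c_new = [fk * ck + ik * gk for fk, ck, ik, gk in zip(f, c, i, g)]
        h_new = [ok * math.tanh(ck) for ok, ck in zip(o, c_new)]
        return h_new, c_new

# Usage: the same two-token context, encoded under two different aspects,
# yields two different contextual representations.
cell = AspectAwareLSTMCell(input_size=4, hidden_size=3, aspect_size=2)
context = [[0.5, 0.1, -0.2, 0.3], [0.0, 0.2, 0.4, -0.1]]

def encode(aspect):
    h, c = [0.0] * 3, [0.0] * 3
    for x in context:
        h, c = cell.step(x, h, c, aspect)
    return h

h1 = encode([1.0, -1.0])   # hypothetical aspect embedding A
h2 = encode([-1.0, 1.0])   # hypothetical aspect embedding B
```

Because the aspect vector enters every gate, the cell can in principle forget aspect-irrelevant context and retain aspect-relevant context as it reads, rather than relying on attention alone after the fact.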
Year | DOI | Venue |
---|---|---|
2019 | 10.24963/ijcai.2019/738 | IJCAI |

DocType | Volume | Citations |
---|---|---|
Conference | abs/1905.07719 | 1 |

PageRank | References | Authors |
---|---|---|
0.36 | 0 | 7 |
Name | Order | Citations | PageRank |
---|---|---|---|
Xing Bowen | 1 | 1 | 0.36 |
Lejian Liao | 2 | 35 | 8.86 |
Dandan Song | 3 | 150 | 19.44 |
Jingang Wang | 4 | 5 | 2.12 |
Fuzheng Zhang | 5 | 984 | 41.96 |
Wang Zhongyuan | 6 | 1 | 0.36 |
Heyan Huang | 7 | 173 | 61.47 |