Title
Self-Attention-Based Message-Relevant Response Generation for Neural Conversation Model
Abstract
Using a sequence-to-sequence framework, many neural conversation models for chit-chat succeed in generating natural responses. Nevertheless, these models tend to give generic responses that are not specific to the given messages, and this remains a challenge. To alleviate this tendency, we propose a method that promotes message-relevant and diverse responses in neural conversation models by using self-attention, which is time-efficient as well as effective. Furthermore, we investigate why and how self-attention is effective through an in-depth comparison with standard dialogue generation. The experimental results show that the proposed method improves over standard dialogue generation on various evaluation metrics.
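For context, the following is a minimal sketch of scaled dot-product self-attention over a sequence of encoder states, the general mechanism the abstract refers to. It is not the paper's exact formulation: the shapes, the learned projections W_q, W_k, W_v, and the helper self_attention are all illustrative assumptions.

```python
# Illustrative sketch of self-attention over encoder states (not the
# paper's exact model). Shapes and projections are assumptions.
import numpy as np

def self_attention(H, W_q, W_k, W_v):
    """Apply scaled dot-product self-attention to states H of shape (T, d).

    Each position attends to every position of the same sequence, yielding
    message-aware representations of shape (T, d_v).
    """
    Q, K, V = H @ W_q, H @ W_k, H @ W_v         # project states to queries/keys/values
    scores = Q @ K.T / np.sqrt(K.shape[-1])     # (T, T) scaled similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                           # attention-weighted sum of values

# Toy usage: 5 encoder states of dimension 8, projected back to dimension 8.
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))
W_q, W_k, W_v = (rng.standard_normal((8, 8)) for _ in range(3))
print(self_attention(H, W_q, W_k, W_v).shape)  # (5, 8)
```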
Year: 2018
Venue: arXiv: Computation and Language
Field: Conversation, Computer science, Naturalness, Natural language processing, Artificial intelligence
DocType: Journal
Volume: abs/1805.08983
Citations: 0
PageRank: 0.34
References: 7
Authors: 3
Name            Order  Citations  PageRank
Jonggu Kim      1      0          0.68
Doyeon Kong     2      0          0.68
Jong-Hyeok Lee  3      740        97.88