Title
An emotion-based responding model for natural language conversation
Abstract
As an important task in artificial intelligence, natural language conversation has attracted wide attention from researchers in natural language processing. Existing work in this field mainly focuses on the consistency of neural response generation while ignoring the effect of the emotion state on the generated response. In this paper, we propose an Emotion-based natural language Responding Model (ERM) to address this challenge. ERM encodes the emotion state of the conversation as a distributed embedding in the response generation process, redefines the objective function so that the model is trained jointly, and introduces a novel re-rank function to select the appropriate response. Experimental results on a Chinese conversation dataset show that our method yields performance improvements in Perplexity (PPL), Word Error Rate (WER), and Bilingual Evaluation Understudy (BLEU) over the baseline sequence-to-sequence (Seq2Seq) model, and achieves better emotion and content consistency of the response than the state of the art.
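The abstract describes conditioning response generation on an emotion state encoded as a distributed embedding. Below is a minimal sketch of that idea, not the authors' implementation: a decoder that concatenates a learned emotion vector to the word embedding at every step. The GRU architecture, class name `EmotionConditionedDecoder`, vocabulary size, and dimensions are illustrative assumptions; the joint objective and re-rank function from the paper are not shown.

```python
# Sketch only: emotion state as a learned embedding fed into each decoding step,
# in the spirit of ERM as summarized in the abstract. All names and sizes are assumptions.
import torch
import torch.nn as nn

class EmotionConditionedDecoder(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hid_dim=256, num_emotions=6):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        self.emo_emb = nn.Embedding(num_emotions, emb_dim)          # emotion state as a distributed vector
        self.gru = nn.GRU(emb_dim * 2, hid_dim, batch_first=True)   # word embedding + emotion embedding per step
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, prev_tokens, emotion_ids, enc_state):
        # prev_tokens: (batch, seq_len); emotion_ids: (batch,); enc_state: (1, batch, hid_dim)
        w = self.word_emb(prev_tokens)                               # (batch, seq_len, emb_dim)
        e = self.emo_emb(emotion_ids).unsqueeze(1).expand(-1, w.size(1), -1)
        h, _ = self.gru(torch.cat([w, e], dim=-1), enc_state)       # condition every step on the emotion
        return self.out(h)                                           # per-step vocabulary logits
```

In such a setup, a re-rank step as mentioned in the abstract would score the candidate responses produced by the decoder, e.g. by combining generation likelihood with an emotion-consistency term, and return the highest-scoring one.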
Year
2019
DOI
10.1007/s11280-018-0601-2
Venue
World Wide Web
Keywords
Natural language conversation, Response generation, Distributed embedding, Objective function, Re-rank function
Field
Perplexity, Conversation, Embedding, Computer science, Natural language, Artificial intelligence, Natural language processing, Machine learning
DocType
Journal
Volume
22
Issue
SP2
ISSN
1573-1413
Citations
2
PageRank
0.36
References
34
Authors
6
Name            Order  Citations  PageRank
feng liu        1      180        39.13
Qirong Mao      2      261        34.29
Liangjun Wang   3      7          1.79
Nelson Ruwa     4      4          1.73
Jianping Gou    5      116        24.01
Yongzhao Zhan   6      344        51.09