Title: Learning Multi-Level Information for Dialogue Response Selection by Highway Recurrent Transformer
Abstract (highlights):
• A new variant of the attention mechanism focuses on modeling cross-sentence attention.
• A novel model integrates highway attention into the Transformer for modeling dialogues (illustrated below).
• Our model is capable of modeling complex dialogue-level information.
• Results on two response selection datasets show consistent performance.
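The second highlight names the model's core idea: integrating highway attention into a Transformer for dialogue modeling. Since this record carries no implementation details, below is only a minimal illustrative sketch in PyTorch of what a highway-style gated cross-sentence attention layer might look like; the class name HighwayAttention, the gate g * attended + (1 - g) * x, and all hyperparameters are assumptions, not the authors' architecture.

# Hypothetical sketch of a highway-gated cross-sentence attention layer.
# Everything here (names, gating form, sizes) is an assumption for
# illustration; the record does not describe the authors' implementation.
import torch
import torch.nn as nn

class HighwayAttention(nn.Module):
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        # Cross-attention from one utterance to another sentence in the dialogue.
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Per-dimension highway gate computed from the query representation.
        self.gate = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
        attended, _ = self.attn(x, context, context)   # attend x -> context
        g = torch.sigmoid(self.gate(x))                # gate values in (0, 1)
        # Highway combination: pass attended information through, or keep the input.
        return g * attended + (1.0 - g) * x

# Toy usage: batch of 2 utterances (length 5) attending to context (length 7).
layer = HighwayAttention(d_model=16, n_heads=4)
x = torch.randn(2, 5, 16)
context = torch.randn(2, 7, 16)
print(layer(x, context).shape)  # torch.Size([2, 5, 16])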
Year: 2020
DOI: 10.1016/j.csl.2020.101073
Venue: Computer Speech & Language
Keywords: Response selection, Transformer, Attention mechanism, Dialogue, DSTC
DocType: Journal
Volume: 63
ISSN: 0885-2308
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name             Order  Citations  PageRank
Ting-Rui Chiang  1      0          1.69
Chao-Wei Huang   2      0          1.01
Shang-Yu Su      3      9          4.88
Yun-Nung Chen    4      324        35.41