Title |
---|
Learning Multi-Level Information for Dialogue Response Selection by Highway Recurrent Transformer |
Abstract |
---|
• A new variant of the attention mechanism focuses on modeling cross-sentence attention.
• A novel model integrates highway attention into the Transformer for modeling dialogues.
• Our model is capable of modeling complex dialogue-level information.
• The results on two response selection datasets show consistent performance.
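The highlights mention a "highway attention" that models cross-sentence attention inside a Transformer. As an illustration only (the paper's exact formulation is not given here), a highway-style combination gates an attended representation with the original input, y = g ⊙ Attn(x, context) + (1 − g) ⊙ x. The function and parameter names below (`highway_attention`, `Wg`, `bg`) are hypothetical, not from the paper:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(q, k, v):
    # scaled dot-product attention from one sentence's tokens (q)
    # to another sentence's tokens (k, v)
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    return softmax(scores, axis=-1) @ v

def highway_attention(x, context, Wg, bg):
    # hypothetical highway gate: blend attended context with the input,
    # y = g * attended + (1 - g) * x, g = sigmoid(x @ Wg + bg)
    attended = cross_attention(x, context, context)
    g = 1.0 / (1.0 + np.exp(-(x @ Wg + bg)))
    return g * attended + (1.0 - g) * x

rng = np.random.default_rng(0)
d = 8
x = rng.normal(size=(5, d))      # 5 tokens of the current utterance
ctx = rng.normal(size=(7, d))    # 7 tokens of a context utterance
Wg = rng.normal(size=(d, d)) * 0.1
bg = np.zeros(d)
y = highway_attention(x, ctx, Wg, bg)
print(y.shape)  # (5, 8)
```

The gate lets each dimension interpolate between the cross-sentence attention output and the untouched input, which is the usual motivation for highway connections in deep dialogue encoders.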
Year | DOI | Venue
---|---|---
2019 | 10.1016/j.csl.2020.101073 | Computer Speech & Language
Keywords | DocType | Volume
---|---|---
Response selection, Transformer, Attention mechanism, Dialogue, DSTC | Journal | 63

ISSN | Citations | PageRank
---|---|---
0885-2308 | 0 | 0.34
References | Authors
---|---
0 | 4
Name | Order | Citations | PageRank |
---|---|---|---
Ting-Rui Chiang | 1 | 0 | 1.69 |
Chao-Wei Huang | 2 | 0 | 1.01 |
Shang-Yu Su | 3 | 9 | 4.88 |
Yun-Nung Chen | 4 | 324 | 35.41 |