Title |
---|
Learning to Organize a Bag of Words into Sentences with Neural Networks: An Empirical Study |
Abstract |
---|
Sequential information, i.e., word order, is assumed to be essential for processing a sequence with recurrent neural network or convolutional neural network based encoders. However, is it possible to encode natural language without order? Given a bag of words from a disordered sentence, humans may still be able to understand what those words mean by reordering or reconstructing them. Inspired by this intuition, in this paper we study how "order" information takes effect in natural language learning. Through comprehensive experiments, we quantitatively compare the ability of several representative neural models to organize sentences from a bag of words under three typical scenarios, and we summarize empirical findings and challenges that can shed light on future research along this line of work. |
Year | Venue | DocType |
---|---|---|
2021 | NAACL-HLT | Conference |
Citations | PageRank | References |
0 | 0.34 | 0 |
Authors |
---|
6 |
Name | Order | Citations | PageRank |
---|---|---|---|
Chongyang Tao | 1 | 50 | 12.29 |
Shen Gao | 2 | 35 | 10.30 |
Juntao Li | 3 | 0 | 0.68 |
Yansong Feng | 4 | 735 | 64.17 |
Dongyan Zhao | 5 | 998 | 96.35 |
Rui Yan | 6 | 961 | 76.69 |