Title
Dual Inference for Improving Language Understanding and Generation
Abstract
Natural language understanding (NLU) and natural language generation (NLG) hold a strong dual relationship: NLU aims at predicting semantic labels from natural language utterances, while NLG does the opposite. Prior work mainly focused on exploiting this duality during model training in order to obtain better-performing models. However, given the fast-growing scale of models in NLP, retraining entire NLU and NLG models may be impractical. To better address this issue, this paper proposes to leverage the duality at the inference stage without the need for retraining. Experiments on three benchmark datasets demonstrate the effectiveness of the proposed method for both NLU and NLG, showing its great potential for practical use.
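The abstract does not spell out how the duality is exploited at inference time; a minimal sketch, assuming dual inference amounts to reranking one model's candidate outputs with the dual model's score, is given below. All function names, the toy scores, and the interpolation weight alpha are illustrative assumptions, not taken from the paper.

# Minimal sketch of dual inference as candidate reranking (hypothetical names).
from typing import Callable, List

def dual_inference_rerank(
    candidates: List[str],
    forward_score: Callable[[str], float],   # e.g., log P(semantic frame | utterance) from the NLU model
    backward_score: Callable[[str], float],  # e.g., log P(utterance | semantic frame) from the NLG model
    alpha: float = 0.5,                      # assumed weight balancing the two directions
) -> str:
    """Return the candidate maximizing a weighted combination of both directions' scores."""
    return max(
        candidates,
        key=lambda c: alpha * forward_score(c) + (1.0 - alpha) * backward_score(c),
    )

# Toy usage with stand-in scoring functions (not real model outputs):
best = dual_inference_rerank(
    ["inform(food=italian)", "request(price)"],
    forward_score=lambda c: -0.10 * len(c),
    backward_score=lambda c: -0.05 * len(c),
)

Because only scores from already-trained NLU and NLG models are combined, no parameters need to be updated, which matches the abstract's claim that the duality is used without retraining.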
Year
2020
DOI
10.18653/V1/2020.FINDINGS-EMNLP.443
Venue
EMNLP
DocType
Conference
Volume
2020.findings-emnlp
Citations
0
PageRank
0.34
References
0
Authors
3
Name               Order  Citations  PageRank
Shang-Yu Su        1      9          4.88
Yung-Sung Chuang   2      14         3.74
Yun-Nung Chen      3      324        35.41