Title
Improving Neural Text Style Transfer by Introducing Loss Function Sequentiality
Abstract
Text style transfer is an important issue for conversational agents, as it allows utterance production to be adapted to specific dialogue situations. It consists of introducing a given style into a sentence while preserving its semantics. Within this scope, different strategies have been proposed that either rely on parallel data or take advantage of unsupervised techniques. In this paper, we follow the latter approach and show that sequentially introducing different loss functions into the learning process can boost the performance of a standard model. We also show that combining style classifiers that focus on either global or local textual information improves sentence generation. Experiments on the Yelp dataset show that our methodology competes strongly with current state-of-the-art models across style accuracy, grammatical correctness, and content preservation.
Year
2021
DOI
10.1145/3404835.3463026
Venue
Research and Development in Information Retrieval
Keywords
Text style transfer, sentiment transfer, loss function sequentiality, combining style classifiers
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
4
Name             Order  Citations  PageRank
Chinmay Rane     1      0          0.34
Gaël Dias        2      354        41.95
Alexis Lechervy  3      6          3.52
Asif Ekbal       4      737        119.31