Title
Natural Language Multitasking: Analyzing and Improving Syntactic Saliency of Hidden Representations.
Abstract
We train multi-task autoencoders on linguistic tasks and analyze the learned hidden sentence representations. The representations change significantly when translation and part-of-speech decoders are added. The more decoders a model employs, the better it clusters sentences by syntactic similarity, as the representation space becomes less entangled. We explore the structure of the representation space by interpolating between sentences, which yields interesting pseudo-English sentences, many of which have recognizable syntactic structure. Lastly, we point out an interesting property of our models: the difference vector between the representations of two sentences can be added to the representation of a third sentence with similar features to change it in a meaningful way.
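The abstract describes two operations on sentence representations: linear interpolation between two encodings, and adding the difference vector of two sentences to a third. A minimal sketch of both, assuming representations are plain vectors (the encoder/decoder themselves are the paper's models and are not reproduced here):

```python
import numpy as np

# Hypothetical sketch of the two latent-space operations the abstract
# mentions. The vectors z_* stand in for hidden sentence representations
# produced by the paper's multi-task autoencoder (assumed, not shown).

def interpolate(z_a, z_b, steps=5):
    """Linearly interpolate between two sentence representations."""
    return [(1 - t) * z_a + t * z_b for t in np.linspace(0.0, 1.0, steps)]

def apply_difference(z_a, z_b, z_c):
    """Add the difference vector (z_b - z_a) to a third sentence's
    representation, shifting it by the same feature change."""
    return z_c + (z_b - z_a)
```

Decoding each interpolated vector back to text is what yields the pseudo-English sentences the abstract refers to.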
Year: 2018
Venue: international conference on neural information processing
Field: Salience (neuroscience), Computer science, Natural language, Natural language processing, Artificial intelligence, Human multitasking, Sentence, Syntax, Syntactic structure
DocType:
Volume: abs/1801.06024
Citations: 2
Journal:
PageRank: 0.35
References: 5
Authors: 4
Name                Order  Citations  PageRank
Gino Brunner        1      20         5.76
Yuyi Wang           2      11         3.69
Roger Wattenhofer   3      55803      84.89
Michael Weigelt     4      2          0.69