Title
Language as a Latent Variable: Discrete Generative Models for Sentence Compression.
Abstract
In this work we explore deep generative models of text in which the latent representation of a document is itself drawn from a discrete language model distribution. We formulate a variational auto-encoder for inference in this model and apply it to the task of compressing sentences. In this application the generative model first draws a latent summary sentence from a background language model, and then subsequently draws the observed sentence conditioned on this latent summary. In our empirical evaluation we show that generative formulations of both abstractive and extractive compression yield state-of-the-art results when trained on a large amount of supervised data. Further, we explore semi-supervised compression scenarios where we show that it is possible to achieve performance competitive with previously proposed supervised models while training on a fraction of the supervised data.
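A minimal sketch of the training objective this abstract implies, in standard variational auto-encoder notation (the symbols x, s, \theta, \phi are ours for illustration, not taken from the paper): with observed sentence x, latent summary sentence s drawn from a background language-model prior p(s), and an inference network q_\phi(s \mid x), the evidence lower bound is

% ELBO for a latent-summary model: language-model prior p(s),
% reconstruction model p_\theta(x | s), inference network q_\phi(s | x).
% Notation is an assumption for illustration; the paper's exact symbols may differ.
\log p_\theta(x) = \log \sum_{s} p_\theta(x \mid s)\, p(s)
  \;\geq\; \mathbb{E}_{q_\phi(s \mid x)}\!\big[ \log p_\theta(x \mid s) \big]
  \;-\; \mathrm{KL}\!\big( q_\phi(s \mid x) \,\|\, p(s) \big)

Because s is a discrete sequence, this expectation cannot be reparameterized as in a Gaussian VAE; optimizing such a bound would typically rely on score-function (REINFORCE-style) gradient estimates.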
Year
2016
DOI
10.18653/v1/D16-1031
Venue
EMNLP
DocType
Conference
Volume
abs/1609.07317
Citations
37
PageRank
1.19
References
22
Authors
2
Name            Order   Citations   PageRank
Yishu Miao      1       178         11.44
Phil Blunsom    2       3130        152.18