Title
Multichannel Generative Language Model: Learning All Possible Factorizations Within and Across Channels
Abstract
A channel corresponds to a viewpoint or transformation of an underlying meaning. For example, a pair of parallel sentences in English and French expresses the same underlying meaning through two separate channels, one per language. In this work, we present the Multichannel Generative Language Model (MGLM), a generative joint distribution model over channels that marginalizes over all possible factorizations within and across all channels. MGLM supports flexible inference, including unconditional generation, conditional generation (where one channel is observed and the other channels are generated), and partially observed generation (where incomplete observations are spread across all channels). We experiment with the Multi30K dataset, which contains parallel English, French, Czech, and German, and demonstrate unconditional, conditional, and partially observed generation. We provide qualitative samples drawn unconditionally from the generative joint distribution, quantitatively analyze the quality-diversity trade-offs, and find that MGLM outperforms traditional bilingual discriminative models.
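To make the factorization claim concrete, one way to write an order-marginalized likelihood is sketched below in LaTeX; the notation (a flat sequence x concatenating all channels' tokens, a generation order \sigma, and a prior p(\sigma) over orders) is an assumption made here for illustration, not notation taken from the paper.

% Minimal sketch, assuming x = (x_1, ..., x_n) concatenates the tokens of all
% channels and S_n denotes the set of permutations of {1, ..., n}; marginalizing
% the joint over every generation order \sigma gives
\[
p(x) = \sum_{\sigma \in S_n} p(\sigma) \prod_{t=1}^{n} p\left( x_{\sigma(t)} \mid x_{\sigma(1)}, \ldots, x_{\sigma(t-1)} \right).
\]
% Under this view, conditional and partially observed generation correspond to
% fixing the observed tokens in every conditioning prefix and generating only
% the remaining positions.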
Year
2020
DOI
10.18653/V1/2020.FINDINGS-EMNLP.376
Venue
EMNLP
DocType
Conference
Volume
2020.findings-emnlp
Citations
0
PageRank
0.34
References
0
Authors
3
Name          Order  Citations  PageRank
Harris Chan   1      1          2.03
Kiros, Ryan   2      22659      4.80
William Chan  3      3572       4.67