Title
Unsupervised Recurrent Neural Network Grammars
Abstract
Recurrent neural network grammars (RNNG) are generative models of language which jointly model syntax and surface structure by incrementally generating a syntax tree and sentence in a top-down, left-to-right order. Supervised RNNGs achieve strong language modeling and parsing performance, but require an annotated corpus of parse trees. In this work, we experiment with unsupervised learning of RNNGs. Since directly marginalizing over the space of latent trees is intractable, we instead apply amortized variational inference. To maximize the evidence lower bound, we develop an inference network parameterized as a neural CRF constituency parser. On language modeling, unsupervised RNNGs perform as well as their supervised counterparts on benchmarks in English and Chinese. On constituency grammar induction, they are competitive with recent neural language models that induce tree structures from words through attention mechanisms.
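For reference, the objective described in the abstract is the evidence lower bound over latent trees; the notation below (x for the sentence, z for a latent tree, p_theta for the generative RNNG, q_phi for the CRF inference network) is a sketch assumed here rather than quoted from the paper:

\[
\log p_\theta(x) \;=\; \log \sum_{z \in \mathcal{Z}(x)} p_\theta(x, z)
\;\ge\; \mathbb{E}_{q_\phi(z \mid x)}\!\big[\log p_\theta(x, z)\big] \;+\; \mathbb{H}\big[q_\phi(z \mid x)\big],
\]

where \(\mathcal{Z}(x)\) denotes the set of binary trees over x. Because q_phi is a neural CRF over constituency structures, its entropy and posterior samples can be computed with the inside dynamic program, which is what makes maximizing this bound feasible without annotated trees.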
Year: 2019
Venue: arXiv: Computation and Language
Field: Rule-based machine translation, Computer science, Recurrent neural network, Artificial intelligence, Natural language processing
DocType: Journal
Volume: abs/1904.03746
Citations: 1
PageRank: 0.35
References: 0
Authors: 6
Name               Order  Citations  PageRank
Yoon Kim           1      1533       57.57
Alexander M. Rush  2      1499       67.53
Lei Yu             3      220        11.55
Adhiguna Kuncoro   4      181        8.49
Chris Dyer         5      5438       232.28
Gábor Melis        6      1          0.35