Title
Neural Language Modeling by Jointly Learning Syntax and Lexicon.
Abstract
We propose a neural language model capable of unsupervised syntactic structure induction. The model leverages the structure information to form better semantic representations and better language modeling. Standard recurrent neural networks are limited by their structure and fail to efficiently use syntactic information. On the other hand, tree-structured recursive networks usually require additional structural supervision, at the cost of human expert annotation. In this paper, we propose a novel neural language model, called the Parsing-Reading-Predict Networks (PRPN), that can simultaneously induce the syntactic structure from unannotated sentences and leverage the inferred structure to learn a better language model. In our model, the gradient can be directly back-propagated from the language model loss into the neural parsing network. Experiments show that the proposed model can discover the underlying syntactic structure and achieve state-of-the-art performance on word/character-level language model tasks.
Year: 2017
Venue: International Conference on Learning Representations
Field: Annotation, Computer science, Recurrent neural network, Lexicon, Natural language processing, Artificial intelligence, Parsing, Syntax, Language model, Recursion, Syntactic structure
DocType: Journal
Volume: abs/1711.02013
Citations: 5
PageRank: 0.42
References: 31
Authors: 4
Name               Order  Citations  PageRank
Yikang Shen        1      35         6.62
Zhouhan Lin        2      419        17.51
Chin-Wei Huang     3      8          5.18
Aaron C. Courville 4      6671       348.46