Title
Embedding Lexical Features via Low-Rank Tensors
Abstract
Modern NLP models rely heavily on engineered features, which often combine word and contextual information into complex lexical features. Such combination results in large numbers of features, which can lead to over-fitting. We present a new model that represents complex lexical features---comprised of parts for words, contextual information and labels---in a tensor that captures conjunction information among these parts. We apply low-rank tensor approximations to the corresponding parameter tensors to reduce the parameter space and improve prediction speed. Furthermore, we investigate two methods for handling features that include $n$-grams of mixed lengths. Our model achieves state-of-the-art results on tasks in relation extraction, PP-attachment, and preposition disambiguation.
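To make the abstract's core idea concrete, here is a minimal sketch (not the authors' code) of scoring a conjunction of word, context, and label parts with a low-rank CP approximation of the parameter tensor, so the full tensor of conjunction weights is never materialized. The dimensions, rank, and variable names (U, V, Q) are illustrative assumptions.

```python
import numpy as np

# Assumed, illustrative sizes: word features, context features, labels, CP rank.
d_word, d_ctx, d_label, rank = 300, 50, 10, 20
rng = np.random.default_rng(0)

# Factor matrices stand in for the dense d_word x d_ctx x d_label parameter
# tensor T, approximated as T ≈ sum_r U_r ⊗ V_r ⊗ Q_r.
U = rng.normal(size=(rank, d_word))
V = rng.normal(size=(rank, d_ctx))
Q = rng.normal(size=(rank, d_label))

def score(w, c, y):
    """Score the conjunction feature <T, w ⊗ c ⊗ y> using the low-rank
    factors: sum_r (U_r . w)(V_r . c)(Q_r . y). This costs O(rank * d)
    instead of touching every cell of the full tensor."""
    return float(np.sum((U @ w) * (V @ c) * (Q @ y)))

# Example instance: dense word/context feature vectors and a label indicator.
w = rng.normal(size=d_word)   # word part
c = rng.normal(size=d_ctx)    # contextual part
y = np.eye(d_label)[3]        # indicator for one candidate label
print(score(w, c, y))
```

The point of the factorization, as the abstract notes, is that the parameter space shrinks from the product of the three part dimensions to their sum times the rank, which both regularizes the model and speeds up prediction.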
Year
2016
DOI
10.18653/v1/N16-1117
Venue
HLT-NAACL
DocType
Conference
Volume
abs/1604.00461
Citations
0
PageRank
0.34
References
26
Authors
4
Name              Order  Citations  PageRank
Mo Yu             1      790        47.80
Mark Dredze       2      3092       176.22
R. Arora          3      489        35.97
Matthew Gormley   4      84         10.25