Abstract |
---|
Deep contextualized word embeddings (Embeddings from Language Models, ELMo), an emerging and effective replacement for static word embeddings, have achieved success on a range of syntactic and semantic NLP problems. However, little is known about what is responsible for the improvements. In this article, we focus on the effect of ELMo on two typical syntactic problems: universal POS tagging and dependency parsing. We incorporate ELMo as additional word embeddings into a state-of-the-art POS tagger and dependency parser, which leads to consistent performance improvements. Experimental results show that the model using ELMo outperforms the state-of-the-art baseline by an average of 0.91 points for POS tagging and 1.11 points for dependency parsing. Further analysis reveals that the improvements mainly result from ELMo's better abstraction of out-of-vocabulary (OOV) words, and that the character-level word representation in ELMo contributes substantially to this abstraction. Building on ELMo's advantage on OOV words, we conduct experiments that simulate low-resource settings; the results show that deep contextualized word embeddings are effective for data-insufficient tasks where the OOV problem is severe.
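The incorporation the abstract describes, using ELMo vectors as additional word features alongside static embeddings, can be sketched minimally. Everything below is an illustrative assumption rather than the paper's actual model: the dimensions, the toy vocabulary, and the `mock_elmo` function (a stand-in for a real pretrained, context-sensitive ELMo encoder) are all hypothetical.

```python
import numpy as np

np.random.seed(0)

# Hypothetical dimensions (assumptions, not taken from the paper).
STATIC_DIM = 100   # static word embedding size
ELMO_DIM = 1024    # ELMo output size
N_TAGS = 17        # universal POS tag set size

# Toy lookup table standing in for a trained static embedding matrix.
vocab = {"the": 0, "cat": 1, "sat": 2}
static_table = np.random.randn(len(vocab), STATIC_DIM)

def mock_elmo(tokens):
    # A real ELMo encoder produces context-sensitive vectors; here we
    # just return random vectors of the right shape for illustration.
    return np.random.randn(len(tokens), ELMO_DIM)

def embed(tokens):
    static = np.stack([static_table[vocab[t]] for t in tokens])
    contextual = mock_elmo(tokens)
    # Core idea: concatenate the contextual (ELMo) vectors with the
    # static embeddings and feed the result to the tagger/parser as
    # its word representation.
    return np.concatenate([static, contextual], axis=-1)

# Toy linear tag scorer over the concatenated representation.
W = np.random.randn(STATIC_DIM + ELMO_DIM, N_TAGS)
tokens = ["the", "cat", "sat"]
scores = embed(tokens) @ W
print(scores.shape)  # one row of tag scores per token: (3, 17)
```

In the actual systems, the concatenated representation would feed a neural tagger or biaffine parser rather than a single linear layer; the sketch only shows where the ELMo vectors enter the model.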
Year | DOI | Venue |
---|---|---|
2020 | 10.1145/3326497 | ACM Transactions on Asian and Low-Resource Language Information Processing (TALLIP) |
Keywords | Field | DocType
---|---|---
Natural language processing, POS tagging, deep contextualized word embeddings, out-of-vocabulary words, universal dependency parsing, visualization | Word representation, Abstraction, Visualization, Computer science, Dependency grammar, Natural language processing, Artificial intelligence, Syntax, Language model | Journal

Volume | Issue | ISSN
---|---|---
19 | 1 | 2375-4699

Citations | PageRank | References
---|---|---
0 | 0.34 | 0
Authors |
---|
6 |
Name | Order | Citations | PageRank |
---|---|---|---|
Yijia Liu | 1 | 0 | 2.03 |
Wanxiang Che | 2 | 711 | 66.39 |
Yuxuan Wang | 3 | 144 | 12.04 |
bo zheng | 4 | 25 | 7.29 |
Bing Qin | 5 | 1076 | 72.82 |
Ting Liu | 6 | 2735 | 232.31 |