Title
TWilBert: Pre-trained deep bidirectional transformers for Spanish Twitter
Abstract
In recent years, the Natural Language Processing community has been moving from uncontextualized word embeddings towards contextualized word embeddings. Among these contextualized architectures, BERT stands out due to its capacity to compute bidirectional contextualized word representations. However, the competitive performance it achieves on English downstream tasks is not matched by its multilingual version when applied to other languages and domains. This is especially true for the Spanish language as used on Twitter.
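As a minimal illustration of the contextualized representations the abstract refers to, the sketch below loads a multilingual BERT model with the Hugging Face transformers library and extracts one contextual vector per token for a Spanish tweet. The model name, example tweet, and library choice are assumptions for illustration; they are not taken from the paper, which introduces its own pre-trained model.

# Minimal sketch (not the paper's code): contextual embeddings for a
# Spanish tweet using multilingual BERT via Hugging Face transformers.
import torch
from transformers import AutoModel, AutoTokenizer

# "bert-base-multilingual-cased" is an illustrative public checkpoint,
# not the TWilBERT model released by the authors.
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")

tweet = "No me lo puedo creer, ¡qué golazo!"  # hypothetical example tweet
inputs = tokenizer(tweet, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Unlike a static word embedding, each subword token gets a vector that
# depends on its context: shape (1, sequence_length, hidden_size).
contextual_embeddings = outputs.last_hidden_state
print(contextual_embeddings.shape)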
Year
2021
DOI
10.1016/j.neucom.2020.09.078
Venue
Neurocomputing
Keywords
Contextualized Embeddings, Spanish, Twitter, TWilBERT
DocType
Journal
Volume
426
ISSN
0925-2312
Citations
0
PageRank
0.34
References
0
Authors
3
Name                  Order  Citations  PageRank
José-Ángel González   1      0          1.69
Lluís-F. Hurtado      2      142        24.68
Ferran Pla            3      173        30.71