Title
BERT Prescriptions to Avoid Unwanted Headaches - A Comparison of Transformer Architectures for Adverse Drug Event Detection.
Abstract
Pretrained transformer-based models, such as BERT and its variants, have become a common choice for obtaining state-of-the-art performance on NLP tasks. In the identification of Adverse Drug Events (ADEs) from social media texts, for example, BERT architectures rank first on the leaderboard. However, a systematic comparison between these models has not yet been carried out. In this paper, we aim to shed light on the differences in their performance by analyzing the results of 12 models tested on two standard benchmarks. SpanBERT and PubMedBERT emerged as the best models in our evaluation: this result shows that span-based pretraining gives a decisive advantage in the precise recognition of ADEs, and that in-domain language pretraining is particularly useful when the transformer model is pretrained from scratch exclusively on biomedical text.
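The abstract frames ADE identification as a task solved by fine-tuning pretrained transformers. Below is a minimal sketch of how such a model could be set up for ADE detection cast as token classification, assuming the Hugging Face transformers library, a BIO tagging scheme, and the publicly released PubMedBERT checkpoint; these are illustrative assumptions, not the authors' exact experimental configuration.

```python
# Sketch: ADE span detection as token classification with a pretrained
# transformer. The checkpoint name and label set are assumptions for
# illustration; the paper compares 12 such models on two benchmarks.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

MODEL_NAME = "microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext"
labels = ["O", "B-ADE", "I-ADE"]  # assumed BIO scheme for ADE spans

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForTokenClassification.from_pretrained(
    MODEL_NAME, num_labels=len(labels)
)

text = "After taking the medication I developed a severe skin rash."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, num_labels)
predictions = logits.argmax(dim=-1).squeeze(0)

# Map predicted label ids back to tokens. Note the classification head is
# randomly initialized here; in practice it would first be fine-tuned on
# an annotated ADE benchmark before predictions are meaningful.
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"].squeeze(0))
for token, pred in zip(tokens, predictions):
    print(f"{token}\t{labels[pred]}")
```

The same scaffold applies to the other architectures compared in the paper (e.g. SpanBERT) by swapping the checkpoint name, which is what makes a controlled comparison across pretraining strategies straightforward.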
Year: 2021
Venue: EACL
DocType: Conference
Citations: 0
PageRank: 0.34
References: 0
Authors: 5

Name                 Order  Citations  PageRank
Beatrice Portelli    1      0          1.01
Edoardo Lenzi        2      0          0.34
Emmanuele Chersoni   3      0          1.35
Giuseppe Serra       4      0          1.01
Enrico Santus        5      0          0.34