Title
Enriching contextualized language model from knowledge graph for biomedical information extraction
Abstract
Biomedical information extraction (BioIE) is an important task that aims to analyze biomedical texts and extract structured information such as named entities and the semantic relations between them. In recent years, pre-trained language models have largely improved the performance of BioIE. However, they neglect to incorporate external structural knowledge, which can provide rich factual information to support understanding and reasoning for biomedical information extraction. In this paper, we first evaluate current extraction methods, including vanilla neural networks, general language models and pre-trained contextualized language models, on biomedical information extraction tasks such as named entity recognition, relation extraction and event extraction. We then propose to enrich a contextualized language model by integrating large-scale biomedical knowledge graphs (namely, BioKGLM). To encode knowledge effectively, we explore a three-stage training procedure and introduce different fusion strategies to facilitate knowledge injection. Experimental results on multiple tasks show that BioKGLM consistently outperforms state-of-the-art extraction models. Further analysis shows that BioKGLM can capture the underlying relations between biomedical knowledge concepts, which are crucial for BioIE.
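The abstract describes injecting knowledge-graph information into a contextualized language model through fusion strategies. The paper's actual implementation is not reproduced here; the sketch below is a minimal, hypothetical PyTorch illustration of one common gated-fusion approach, in which pre-trained KG entity embeddings are projected into the language model's hidden space and merged with token representations. All names (KnowledgeFusionLayer, entity_proj, entity_mask, dimensions) are assumptions for illustration, not the authors' code.

```python
# Illustrative sketch only: not the BioKGLM implementation. Names and
# dimensions are hypothetical, showing one common way to fuse KG entity
# embeddings with contextualized token representations.
import torch
import torch.nn as nn

class KnowledgeFusionLayer(nn.Module):
    """Fuses KG entity embeddings into token representations via a learned gate."""
    def __init__(self, hidden_dim: int, entity_dim: int):
        super().__init__()
        self.entity_proj = nn.Linear(entity_dim, hidden_dim)  # map KG space -> LM space
        self.gate = nn.Linear(hidden_dim * 2, hidden_dim)     # how much knowledge to keep per token
        self.norm = nn.LayerNorm(hidden_dim)

    def forward(self, token_states, entity_embeds, entity_mask):
        # token_states:  (batch, seq_len, hidden_dim) from the contextualized LM
        # entity_embeds: (batch, seq_len, entity_dim) KG embedding aligned to each token
        # entity_mask:   (batch, seq_len) 1.0 where a token is linked to a KG entity
        knowledge = self.entity_proj(entity_embeds)
        knowledge = knowledge * entity_mask.unsqueeze(-1)      # zero out unlinked tokens
        gate = torch.sigmoid(self.gate(torch.cat([token_states, knowledge], dim=-1)))
        return self.norm(token_states + gate * knowledge)      # gated residual fusion

# Toy usage with random tensors
fusion = KnowledgeFusionLayer(hidden_dim=768, entity_dim=100)
tokens = torch.randn(2, 16, 768)
entities = torch.randn(2, 16, 100)
mask = torch.randint(0, 2, (2, 16)).float()
fused = fusion(tokens, entities, mask)
print(fused.shape)  # torch.Size([2, 16, 768])
```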
Year
2021
DOI
10.1093/bib/bbaa110
Venue
BRIEFINGS IN BIOINFORMATICS
Keywords
biomedical information extraction, language model, neural network, knowledge graph
DocType
Journal
Volume
22
Issue
1
ISSN
1467-5463
Citations
3
PageRank
0.35
References
21
Authors
5
Name           Order  Citations  PageRank
Hao Fei        1      1          0.35
Yafeng Ren     2      102        13.57
Yue Zhang      3      1364       114.17
Donghong Ji    4      892        120.08
Xiaohui Liang  5      2          0.78