Title
Incorporating representation learning and multihead attention to improve biomedical cross-sentence n-ary relation extraction.
Abstract
Most biomedical information extraction focuses on binary relations within single sentences. However, there is a growing need to extract n-ary relations that span multiple sentences. At present, mainstream methods for cross-sentence n-ary relation extraction rely heavily on syntactic parsing and ignore prior knowledge. In this paper, we propose a novel cross-sentence n-ary relation extraction method that combines multihead attention with knowledge representations learned from a knowledge graph. Our model is built on self-attention, which can directly capture the dependency between any two words regardless of their syntactic relation. In addition, our method uses entity and relation information from the knowledge base to assist in predicting the relation. Experiments on n-ary relation extraction show that combining context and knowledge representations significantly improves extraction performance, and our results are comparable with those of state-of-the-art methods. Unlike previous approaches, our method operates directly on the word sequence and learns to model the internal structure of sentences. Furthermore, by introducing representations of entities and relations learned from the knowledge graph into cross-sentence n-ary relation extraction, our experiments show that encoding this knowledge provides consistent benefits.
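The abstract describes, at a high level, a context encoder built on multihead self-attention combined with entity representations learned from a knowledge graph for classifying an n-ary relation. The PyTorch sketch below illustrates that general idea only; the layer sizes, mean pooling, concatenation of entity vectors, and classifier head are illustrative assumptions, not the architecture published in the paper.

```python
# Minimal sketch (not the authors' exact model): multihead self-attention over
# a cross-sentence token span, combined with pretrained knowledge-graph entity
# embeddings, to score an n-ary relation among candidate entities.
import torch
import torch.nn as nn


class KnowledgeAwareNaryExtractor(nn.Module):
    def __init__(self, vocab_size, kg_entity_count, n_relations,
                 d_model=128, d_kg=64, n_heads=4, n_ary=3):
        super().__init__()
        # Context encoder: token embeddings + multihead self-attention,
        # which relates any two words without a syntactic parse.
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Knowledge encoder: entity vectors assumed to be learned separately
        # from a knowledge graph; a plain embedding table stands in here.
        self.kg_emb = nn.Embedding(kg_entity_count, d_kg)
        # Classifier over pooled context + the n candidate entities' KG vectors.
        self.classifier = nn.Linear(d_model + n_ary * d_kg, n_relations)

    def forward(self, token_ids, kg_entity_ids):
        # token_ids:     (batch, seq_len)  word indices spanning several sentences
        # kg_entity_ids: (batch, n_ary)    KG indices of the candidate entities
        x = self.tok_emb(token_ids)                    # (batch, seq, d_model)
        ctx, _ = self.attn(x, x, x)                    # self-attention over the span
        pooled = ctx.mean(dim=1)                       # (batch, d_model)
        kg = self.kg_emb(kg_entity_ids).flatten(1)     # (batch, n_ary * d_kg)
        return self.classifier(torch.cat([pooled, kg], dim=-1))


# Toy usage: a drug-gene-mutation triple mentioned across a 20-token span.
model = KnowledgeAwareNaryExtractor(vocab_size=5000, kg_entity_count=300,
                                    n_relations=5)
tokens = torch.randint(0, 5000, (2, 20))
entities = torch.randint(0, 300, (2, 3))
logits = model(tokens, entities)   # (2, 5) relation scores
print(logits.shape)
```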
Year
2020
DOI
10.1186/s12859-020-03629-9
Venue
BMC Bioinformatics
Keywords
Biomedical n-ary relation, Multihead attention, Representation learning
DocType
Journal
Volume
21
Issue
1
ISSN
1471-2105
Citations
0
PageRank
0.34
References
0
Authors
6
Name          Order  Citations  PageRank
Di Zhao       1      1          3.07
Jian Wang     2      10         5.83
Yijia Zhang   3      4          5.47
Xin Wang      4      8          1.11
Hongfei Lin   5      768        122.52
Zhihao Yang   6      73         15.35