Abstract |
---|
Relation classification is an important semantic processing task that has received considerable attention in recent years. The main challenge is that important information can appear at any position in a sentence. We therefore propose bidirectional long short-term memory networks (BLSTM) to model the sentence with complete, sequential information about all words. We also use features derived from lexical resources such as WordNet and from NLP systems such as dependency parsers and named entity recognizers (NER). Experimental results on SemEval-2010 show that the BLSTM-based method with word embeddings as its only input features is sufficient to achieve state-of-the-art performance, and that importing more features further improves performance. |
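The core idea in the abstract, encoding a sentence with a bidirectional LSTM so every word sees both left and right context, can be sketched in a few lines of numpy. This is a minimal illustrative forward pass, not the paper's implementation: the weight shapes, gate ordering, random initialization, and the max-pooling readout are all assumptions made for the sketch.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_forward(x, W, U, b):
    """One-direction LSTM over a sequence.
    x: (T, d) word embeddings; W: (4h, d), U: (4h, h), b: (4h,).
    Gates are stacked in the order i, f, o, g (an arbitrary choice here)."""
    h_dim = U.shape[1]
    h = np.zeros(h_dim)
    c = np.zeros(h_dim)
    outs = []
    for t in range(x.shape[0]):
        z = W @ x[t] + U @ h + b
        i, f, o, g = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)   # cell state update
        h = o * np.tanh(c)           # hidden state
        outs.append(h)
    return np.stack(outs)            # (T, h)

def blstm_encode(x, params_fwd, params_bwd):
    """Bidirectional encoding: run the sequence forward and reversed,
    then concatenate the two hidden states at each time step."""
    fwd = lstm_forward(x, *params_fwd)
    bwd = lstm_forward(x[::-1], *params_bwd)[::-1]
    return np.concatenate([fwd, bwd], axis=1)  # (T, 2h)

# Toy example: T=5 words, embedding dim d=8, hidden size h=6.
rng = np.random.default_rng(0)
d, h, T = 8, 6, 5
make = lambda: (rng.normal(0, 0.1, (4 * h, d)),
                rng.normal(0, 0.1, (4 * h, h)),
                np.zeros(4 * h))
x = rng.normal(size=(T, d))
H = blstm_encode(x, make(), make())
# Max-pool over time to get a fixed-size sentence vector; a softmax
# classifier over relation labels would sit on top of this.
sent = H.max(axis=0)
print(sent.shape)  # (12,)
```

Because the backward pass is reversed again before concatenation, position t of the output holds the forward summary of words 1..t next to the backward summary of words t..T, which is what lets features at any sentence position reach the classifier.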
Year | Venue | Field |
---|---|---|
2015 | PACLIC | Semantic memory, Computer science, Long short term memory, Named entity, Dependency grammar, Natural language processing, Artificial intelligence, Relation classification, WordNet, Sentence |
DocType | Citations | PageRank |
---|---|---|
Conference | 14 | 0.68 |
References | Authors |
---|---|
11 | 4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Shu Zhang | 1 | 31 | 5.43 |
Dequan Zheng | 2 | 14 | 1.36 |
Xinchen Hu | 3 | 14 | 0.68 |
Ming Yang | 4 | 3471 | 162.50 |