Title
An Investigation Of Neural Embeddings For Coreference Resolution
Abstract
Coreference resolution is an important task in Natural Language Processing (NLP): it involves finding all the phrases in a document that refer to the same real-world entity, with applications in question answering and document summarisation. Work in deep learning has led to the training of neural embeddings of words and sentences from unlabelled text. Word embeddings have been shown to capture syntactic and semantic properties of words and have been used in POS tagging and NER tagging to achieve state-of-the-art performance. The key contribution of this paper is therefore to investigate whether neural embeddings can be leveraged to overcome challenges associated with the scarcity of labelled coreference resolution datasets for benchmarking. As a preliminary result, we show that neural embeddings improve the performance of a coreference resolver compared to a baseline.
Year
2015
DOI
10.1007/978-3-319-18111-0_19
Venue
COMPUTATIONAL LINGUISTICS AND INTELLIGENT TEXT PROCESSING (CICLING 2015), PT I
Keywords
coreference resolution, neural embeddings, deep learning
Field
Resolver, Coreference, Question answering, Information retrieval, Computer science, Semantic property, Natural language processing, Artificial intelligence, Deep learning, Syntax, Benchmarking
DocType
Conference
Volume
9041
ISSN
0302-9743
Citations
0
PageRank
0.34
References
21
Authors
3
Name             Order  Citations  PageRank
Varun Godbole    1      0          0.34
W. Liu           2      2          0.75
Roberto Togneri  3      8144       8.33