Abstract |
---|
We propose two improvements on lexical association used in embedding learning: factorizing individual dependency relations and using lexicographic knowledge from monolingual dictionaries. Both proposals provide low-entropy lexical co-occurrence information, and are empirically shown to improve embedding learning by performing notably better than several popular embedding models in similarity tasks. |
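The first proposal, factorizing individual dependency relations, can be illustrated with a toy sketch. All words, relations, and counts below are invented for illustration and are not taken from the paper; the point is only that conditioning co-occurrence counts on a single dependency relation yields a lower-entropy context distribution than pooling all relations together.

```python
import math
from collections import Counter, defaultdict

def entropy(counts):
    """Shannon entropy (in bits) of an empirical count distribution."""
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# Hypothetical (head, relation, dependent) triples, as a dependency
# parser might produce for a tiny corpus.
triples = [
    ("eat", "dobj", "apple"),
    ("eat", "dobj", "bread"),
    ("eat", "nsubj", "dog"),
    ("eat", "nsubj", "cat"),
]

# Pooled contexts of "eat": all relations mixed into one distribution.
pooled = Counter(dep for head, rel, dep in triples if head == "eat")

# Factorized contexts: a separate distribution per dependency relation,
# each sparser and lower-entropy than the pooled one.
by_rel = defaultdict(Counter)
for head, rel, dep in triples:
    if head == "eat":
        by_rel[rel][dep] += 1

print(entropy(pooled))          # 2.0 bits over 4 pooled contexts
print(entropy(by_rel["dobj"]))  # 1.0 bit over 2 dobj contexts
```

The per-relation distributions carry the same co-occurrence evidence but spread it over fewer context types, which is the sense in which factorization provides lower-entropy information to the embedding learner.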
Year | Venue | Field |
---|---|---|
2015 | Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics (ACL) and the 7th International Joint Conference on Natural Language Processing (IJCNLP), Vol 2 | Semantic similarity, Lexical similarity, Embedding, Lexical semantics, Computer science, Euclidean distance, Artificial intelligence, Natural language processing, Parsing, Sentence, Syntax |
DocType | Volume | Citations |
---|---|---|
Conference | P15-2 | 2 |

PageRank | References | Authors |
---|---|---|
0.35 | 18 | 3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Tong Wang | 1 | 85 | 10.63 |
Abdel-rahman Mohamed | 2 | 3772 | 266.13 |
Graeme Hirst | 3 | 2258 | 239.35 |