Title
HWE: Word Embedding with Heterogeneous Features
Abstract
Distributed word representations are widely used in Natural Language Processing. However, traditional approaches, which learn word representations from co-occurrence information in large corpora, may fail to capture fine-grained syntactic and semantic information. In this paper, we propose a general and flexible framework, Heterogeneous Word Embedding (HWE), that explicitly incorporates heterogeneous features (e.g., word sense, part of speech, topic) to learn feature-specific word embeddings. Experimental results on both intrinsic and extrinsic tasks show that HWE outperforms the baseline and various state-of-the-art models. Moreover, by concatenating HWE with the corresponding feature embeddings, each word obtains a different contextual representation in different contexts, which yields a further significant improvement. Finally, we illustrate the insight behind our model by visualizing the learned word embeddings.
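To make the concatenation idea in the abstract concrete, below is a minimal NumPy sketch of how a word embedding and a feature embedding can be joined into a context-dependent representation. This is not the authors' implementation; the toy vocabulary, the choice of part of speech as the feature type, the dimensionality, and the contextual_representation helper are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabularies: words plus one heterogeneous feature
# type (part of speech). The paper also mentions word-sense and topic
# features; the same pattern would apply to each feature type.
words = ["bank", "river", "money", "deposit"]
pos_tags = ["NOUN", "VERB"]
dim = 8

# Stand-ins for learned embeddings; in HWE these would be trained
# jointly from corpus co-occurrence plus the feature annotations.
word_vec = {w: rng.normal(scale=0.1, size=dim) for w in words}
feat_vec = {t: rng.normal(scale=0.1, size=dim) for t in pos_tags}

def contextual_representation(word, pos):
    """Concatenate the word embedding with its feature embedding, so
    the same word gets different vectors in different contexts
    (e.g., 'deposit' as a noun vs. as a verb)."""
    return np.concatenate([word_vec[word], feat_vec[pos]])

noun_deposit = contextual_representation("deposit", "NOUN")
verb_deposit = contextual_representation("deposit", "VERB")
assert not np.allclose(noun_deposit, verb_deposit)
print(noun_deposit.shape)  # (16,) = word dim + feature dim
```

Under this reading, the concatenated vector is what gives each word a distinct representation per context, which the abstract reports as the source of the further improvement.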
Year
2019
DOI
10.1109/ICOSC.2019.8665508
Venue
2019 IEEE 13th International Conference on Semantic Computing (ICSC)
Keywords
Semantics, Task analysis, Training, Natural language processing, Syntactics, Context modeling, Feature extraction
Field
Task analysis, Visualization, Computer science, Feature extraction, Context model, Concatenation, Artificial intelligence, Natural language processing, Word embedding, Syntax, Semantics
DocType
Conference
ISSN
2325-6516
ISBN
978-1-5386-6783-5
Citations
0
PageRank
0.34
References
0
Authors
4
Name              Order  Citations  PageRank
Jhih-Sheng Fan    1      0          0.34
Mu Yang           2      0          0.34
Peng-Hsuan Li     3      6          1.77
Wei-Yun Ma        4      187        21.17