Title |
---|
Deep Word Association: A Flexible Chinese Word Association Method with Iterative Attention Mechanism |
Abstract |
---|
Word association predicts the subsequent words and phrases, acting as a reminder to accelerate the text-editing process. Existing word association models can only predict the next word inflexibly, through either a given word vocabulary or a simple back-off N-gram language model. Herein, we propose a deep word association system based on an attention mechanism, with the following contributions: (1) To the best of our knowledge, this is the first investigation of an attention-based recurrent neural network for word association. In the experiments, we provide a comprehensive study of the attention processes for the word association problem; (2) A novel approach, named DropContext, is proposed to solve the over-fitting problem during the attention training procedure; (3) Compared with conventional vocabulary-based methods, our word association system can generate an arbitrary-length string of reasonable words; (4) Given information at different hierarchies, the proposed system can flexibly generate associated words accordingly. |
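The core idea the abstract describes, attending over the context words already typed to score candidate next words and generating an arbitrary-length continuation, can be sketched as follows. This is a minimal illustrative toy, not the authors' system: the tiny vocabulary, random embeddings, and the `attend`/`associate` helpers are all assumptions introduced here, and a real model would use trained RNN states rather than raw embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["我", "喜欢", "深度", "学习", "模型"]  # toy vocabulary (assumption)
EMB = rng.normal(size=(len(VOCAB), 8))          # toy word embeddings (random, untrained)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, context):
    """Dot-product attention: weight the context vectors by similarity to the query."""
    weights = softmax(context @ query)
    return weights @ context

def associate(context_ids, n_words):
    """Greedily generate an arbitrary-length string of associated words."""
    out = []
    for _ in range(n_words):
        context = EMB[context_ids]
        query = context[-1]                # last typed word acts as the query
        summary = attend(query, context)   # attended summary of the context
        scores = EMB @ summary             # score every vocabulary word
        scores[context_ids] = -np.inf      # avoid trivially repeating the context
        next_id = int(scores.argmax())
        out.append(VOCAB[next_id])
        context_ids = context_ids + [next_id]
    return out

words = associate([0, 1], 3)  # associate 3 words after "我 喜欢"
print(words)
```

Because each generated word is appended to the context before the next attention step, the loop can run for any requested length, which mirrors the arbitrary-length generation claimed in contribution (3).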
Year | Venue | Field
---|---|---
2018 | PRCV | Computer science, Phrase, Recurrent neural network, Word Association, Natural language processing, Artificial intelligence, Attention training, Hierarchy, Vocabulary, Language model
DocType | Citations | PageRank
---|---|---
Conference | 0 | 0.34
References | Authors
---|---
12 | 5
Name | Order | Citations | PageRank |
---|---|---|---|
Yao-Xiong Huang | 1 | 2 | 3.45 |
Zecheng Xie | 2 | 96 | 7.55
Manfei Liu | 3 | 27 | 1.99
Shuaitao Zhang | 4 | 30 | 3.86 |
Lianwen Jin | 5 | 1337 | 113.14 |