Title |
---|
Learning New Semi-Supervised Deep Auto-Encoder Features For Statistical Machine Translation |
Abstract |
---|
In this paper, instead of designing new features based on intuition, linguistic knowledge, or the domain, we learn new and effective features using the deep auto-encoder (DAE) paradigm for the phrase-based translation model. Using an unsupervised pre-trained deep belief net (DBN) to initialize the DAE's parameters and the original input phrase features as a teacher for semi-supervised fine-tuning, we learn new semi-supervised DAE features, which are more effective and stable than the unsupervised DBN features. Moreover, to learn high-dimensional feature representations, we introduce a natural horizontal composition of multiple DAEs for learning large hidden-layer features. On two Chinese-English tasks, our semi-supervised DAE features obtain statistically significant improvements of 1.34/2.45 (IWSLT) and 0.82/1.52 (NIST) BLEU points over the unsupervised DBN features and the baseline features, respectively. |
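The abstract's "horizontal composition" can be pictured with a minimal sketch: several small auto-encoders each map the same phrase-feature vector to a low-dimensional code, and the codes are concatenated to form one higher-dimensional learned feature. The class and function names, layer sizes, and initialization below are illustrative assumptions, and the DBN pre-training and semi-supervised fine-tuning steps from the paper are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyDAE:
    """One-hidden-layer auto-encoder with tied weights (illustrative only;
    the paper stacks several such layers and pre-trains them as a DBN)."""
    def __init__(self, n_in, n_hidden):
        self.W = rng.normal(0.0, 0.1, (n_in, n_hidden))  # encoder weights
        self.b = np.zeros(n_hidden)                      # hidden bias
        self.c = np.zeros(n_in)                          # reconstruction bias

    def encode(self, x):
        # Hidden code for one input vector.
        return sigmoid(x @ self.W + self.b)

    def reconstruct(self, x):
        # Decode with tied (transposed) weights; fine-tuning would
        # minimize the reconstruction error against the input features.
        return sigmoid(self.encode(x) @ self.W.T + self.c)

def horizontal_compose(daes, x):
    """Concatenate the hidden codes of several DAEs, giving a larger
    learned feature vector than any single small DAE provides."""
    return np.concatenate([d.encode(x) for d in daes])

# Hypothetical sizes: 16 input phrase features, two 4-unit DAEs
# composed horizontally into one 8-dimensional learned feature.
x = rng.random(16)
daes = [TinyDAE(16, 4) for _ in range(2)]
feat = horizontal_compose(daes, x)
print(feat.shape)  # (8,)
```

Concatenation keeps each small DAE cheap to train while still yielding the high-dimensional representation the paper reports as more effective.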
Year | Venue | Field
---|---|---
2014 | PROCEEDINGS OF THE 52ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1 | Autoencoder, Computer science, Machine translation, Phrase, Intuition, NIST, Artificial intelligence, Natural language processing, Feature learning, Machine learning
DocType | Volume | Citations
---|---|---
Conference | P14-1 | 9

PageRank | References | Authors
---|---|---
0.51 | 19 | 3
Name | Order | Citations | PageRank
---|---|---|---
Shixiang Lu | 1 | 19 | 3.39 |
Zhenbiao Chen | 2 | 33 | 5.14 |
Bo Xu | 3 | 241 | 36.59 |