Abstract |
---|
Domain adaptation is the supervised learning setting in which the training data are sampled from a source domain while the test data are sampled from a target domain that follows a different distribution. The key to solving such a problem is to reduce the effects of the discrepancy between the training data and the test data. Recently, deep learning methods that employ stacked denoising auto-encoders (SDAs) to learn new representations for both domains have been successfully applied to domain adaptation, and remarkable performance on multi-domain sentiment analysis datasets has been reported, making deep learning a promising approach to domain adaptation problems. In this paper, a deep learning method called Stacked Robust Adaptively Regularized Auto-regressions (SRARAs) is proposed to learn useful representations for domain adaptation problems. Each layer of SRARAs consists of two steps: a linear transformation step, which is based on robust adaptively regularized auto-regression, and a non-linear squashing transformation step. The first step reduces the discrepancy between the training and test data, and the second step introduces non-linearity and bounds the range of the elements in the outputs. Experimental results on text and image datasets demonstrate that the proposed method is highly effective. |
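The layer-wise structure described in the abstract (a regularized linear auto-regression followed by a non-linear squashing transformation, stacked over several layers) can be sketched as follows. This is a minimal illustration only: the paper's robust adaptively regularized auto-regression is not specified in this record, so plain ridge auto-regression (reconstructing `X` from itself under an L2 penalty) stands in for the linear step, `tanh` stands in for the squashing step, and all function names and the `reg` parameter are assumptions, not the authors' implementation.

```python
import numpy as np

def train_layer(X, reg=1e-2):
    """One SRARA-style layer (sketch, not the paper's exact method).

    X   : (d, n) data matrix, source and target samples stacked as columns.
    reg : ridge penalty standing in for the paper's adaptive regularization.

    Linear step: solve min_W ||X - W X||^2 + reg ||W||^2, giving
    W = X X^T (X X^T + reg I)^{-1}. Squashing step: tanh, which bounds
    every output element to (-1, 1).
    """
    d = X.shape[0]
    G = X @ X.T                                   # (d, d) Gram matrix
    W = G @ np.linalg.inv(G + reg * np.eye(d))    # ridge auto-regression
    H = np.tanh(W @ X)                            # non-linear squashing
    return W, H

def stack_layers(X, n_layers=3, reg=1e-2):
    """Stack layers; the concatenated hidden outputs form the new representation."""
    reps, H = [], X
    for _ in range(n_layers):
        _, H = train_layer(H, reg)
        reps.append(H)
    return np.vstack(reps)                        # (n_layers * d, n)
```

After stacking, a standard classifier would be trained on the new representation of the labeled source samples and applied to the target samples, as is typical for SDA-style domain adaptation pipelines.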
Year | DOI | Venue |
---|---|---|
2019 | 10.1109/TKDE.2018.2837085 | IEEE Trans. Knowl. Data Eng. |
Keywords | Field | DocType
---|---|---|
Machine learning, Robustness, Training, Noise reduction, Adaptation models, Learning systems, Training data | Noise reduction, Computer science, Domain adaptation, Sentiment analysis, Supervised learning, Robustness (computer science), Test data, Linear map, Artificial intelligence, Deep learning, Machine learning | Journal
Volume | Issue | ISSN
---|---|---|
31 | 3 | 1041-4347
Citations | PageRank | References
---|---|---|
2 | 0.37 | 0
Authors |
---|
6 |
Name | Order | Citations | PageRank |
---|---|---|---|
Wenhao Jiang | 1 | 32 | 3.91 |
Hongchang Gao | 2 | 54 | 8.32 |
Wei Lu | 3 | 534 | 49.25 |
Wei Liu | 4 | 4041 | 204.19 |
Fu Lai Chung | 5 | 1534 | 86.72 |
Heng Huang | 6 | 3080 | 203.21 |