Abstract |
---|
Domain adaptation, which has recently become one of the most important research directions in machine learning, addresses settings where the source and target domains have different underlying distributions. In this paper, we propose an ensemble learning framework for domain adaptation. Owing to the distribution differences between the source and target domains, the weights in the final model are sensitive to individual target examples. Our method therefore dynamically assigns weights to different test examples with the help of additional classifiers, called model-friendly classifiers, which judge which base models are likely to predict well on a specific test example. The final model can thus give the most favorable weights to each example. In the experiments, we first verify the need for dynamic weights in ensemble-learning-based domain adaptation, and then compare our method with classical methods on real datasets. The experimental results show that our method learns a final model that performs well in the target domain. |
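The dynamic-weighting idea in the abstract can be sketched as follows. This is a minimal illustrative assumption, not the paper's actual construction: the base models are simple threshold classifiers, and each "model-friendly classifier" is approximated by a nearest-centroid rule that estimates, per test example, whether its base model is likely to be correct. All function and variable names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Source domain: two Gaussian blobs, separable mainly on feature 0.
Xs = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
ys = np.array([0] * 100 + [1] * 100)

# Two hypothetical base models: a threshold on feature 0 and on feature 1.
def base0(X): return (X[:, 0] > 0).astype(int)
def base1(X): return (X[:, 1] > 0).astype(int)
bases = [base0, base1]

# Stand-in "model-friendly classifiers": for each base model, store centroids
# of the examples it classified correctly vs. incorrectly (trained on source
# data here, purely for illustration).
centroids = []
for base in bases:
    correct = base(Xs) == ys
    c_right = Xs[correct].mean(axis=0)
    c_wrong = Xs[~correct].mean(axis=0) if (~correct).any() else c_right + 1e6
    centroids.append((c_right, c_wrong))

def friendly_weight(x, c_right, c_wrong):
    # Higher weight when x lies closer to the "base model correct" centroid.
    d_r = np.linalg.norm(x - c_right)
    d_w = np.linalg.norm(x - c_wrong)
    return d_w / (d_r + d_w + 1e-12)

def predict(X):
    preds = np.array([b(X) for b in bases])  # shape: (n_models, n_samples)
    out = []
    for j, x in enumerate(X):
        w = np.array([friendly_weight(x, cr, cw) for cr, cw in centroids])
        w = w / w.sum()                      # per-example (dynamic) weights
        out.append(int(round(w @ preds[:, j])))  # weighted vote
    return np.array(out)

# "Target" examples, shifted slightly from the source distribution.
Xt = np.array([[-1.5, 0.3], [1.8, -0.4]])
print(predict(Xt))
```

The key point is that the weight vector `w` is recomputed for every test example rather than fixed once after training, which is the "dynamic weight assignment" the abstract argues for.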
Year | Venue | Keywords |
---|---|---|
2012 | ICPR | distribution differences, ensemble learning framework, learning (artificial intelligence), pattern classification, dynamical ensemble learning, domain adaptation, dynamic weight assignment, machine learning, model-friendly classifier
Field | DocType | ISSN
---|---|---|
Online machine learning,Semi-supervised learning,Pattern recognition,Active learning (machine learning),Domain adaptation,Computer science,Artificial intelligence,Generalization error,Ensemble learning,Machine learning | Conference | 1051-4651
ISBN | Citations | PageRank
---|---|---|
978-1-4673-2216-4 | 9 | 0.52
References | Authors
---|---|
5 | 2
Name | Order | Citations | PageRank |
---|---|---|---|
Wenting Tu | 1 | 85 | 9.48 |
Shiliang Sun | 2 | 1732 | 115.55 |