Title
Learning Distinctive Margin toward Active Domain Adaptation
Abstract
Despite plenty of effort focused on improving domain adaptation (DA) under unsupervised or few-shot semi-supervised settings, active learning has recently begun to attract more attention because it suits the practical scenario of transferring a model with a limited annotation budget on target data. Nevertheless, most active learning methods are not inherently designed to handle the domain gap between data distributions; on the other hand, some active domain adaptation (ADA) methods require complicated query functions, which are vulnerable to overfitting. In this work, we propose a concise but effective ADA method called Select-by-Distinctive-Margin (SDM), which consists of a maximum-margin loss and a margin-sampling algorithm for data selection. We provide theoretical analysis showing that SDM works like a Support Vector Machine, storing hard examples around decision boundaries and exploiting them to find informative and transferable data. In addition, we propose two variants of our method: one adaptively adjusts the gradient from the margin loss, and the other boosts the selectivity of margin sampling by taking the gradient direction into account. We benchmark SDM under the standard active learning setting and demonstrate that our algorithm achieves competitive results with good data scalability. Code is available at https://github.com/TencentYoutuResearch/ActiveLearning-SDM.
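The core selection step the abstract describes, margin sampling, picks unlabeled target samples whose top two class scores are closest, i.e. those nearest a decision boundary. The sketch below is a minimal generic illustration of that idea, not the authors' exact SDM query function (which the paper builds on a maximum-margin loss and, in one variant, gradient direction); the function name and the use of softmax probabilities are assumptions for illustration.

```python
import numpy as np

def margin_sampling(probs, budget):
    """Select `budget` samples with the smallest top-1 vs top-2
    class-probability margin (the most ambiguous samples).

    probs: (n_samples, n_classes) array of predicted probabilities.
    Returns indices of the selected samples, most ambiguous first.
    """
    sorted_probs = np.sort(probs, axis=1)          # ascending per row
    margins = sorted_probs[:, -1] - sorted_probs[:, -2]
    return np.argsort(margins)[:budget]

# Toy example: 4 unlabeled samples, 3 classes.
probs = np.array([
    [0.90, 0.05, 0.05],  # confident  -> large margin (0.85)
    [0.40, 0.35, 0.25],  # ambiguous  -> small margin (0.05)
    [0.50, 0.30, 0.20],  # margin 0.20
    [0.34, 0.33, 0.33],  # most ambiguous -> margin 0.01
])
print(margin_sampling(probs, budget=2))  # -> [3 1]
```

Samples 3 and 1 are queried for annotation because their margins are smallest; confident samples like 0 are left unlabeled, which is what lets margin-based query rules concentrate the annotation budget on boundary cases.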
Year: 2022
DOI: 10.1109/CVPR52688.2022.00783
Venue: IEEE Conference on Computer Vision and Pattern Recognition
Keywords: Transfer/low-shot/long-tail learning, Machine learning, Self-, semi-, meta- & unsupervised learning
DocType: Conference
Volume: 2022
Issue: 1
Citations: 0
PageRank: 0.34
References: 0
Authors: 9
Name | Order | Citations | PageRank
Ming Xie | 1 | 0 | 0.34
Yuxi Li | 2 | 0 | 0.34
Yabiao Wang | 3 | 21 | 7.05
Zekun Luo | 4 | 4 | 1.05
Zhenye Gan | 5 | 0 | 0.34
Zhongyi Sun | 6 | 14 | 2.52
Mingmin Chi | 7 | 488 | 35.97
Chengjie Wang | 8 | 43 | 19.03
Pei Wang | 9 | 0 | 0.34