Title
Probabilistic Model Distillation for Semantic Correspondence
Abstract
Semantic correspondence is a fundamental problem in computer vision that aims to establish dense correspondences across images depicting different instances of the same category. The task is challenging due to large intra-class variations and a severe lack of ground truth. A popular solution is to learn correspondences from synthetic data. However, because synthetically generated training data offer only limited intra-class appearance and background variations, the capability of models trained in this way to handle "real" image pairs is intrinsically constrained. We address this problem with a novel Probabilistic Model Distillation (PMD) approach, which transfers knowledge learned by a probabilistic teacher model on synthetic data to a static student model using unlabeled real image pairs. A probabilistic supervision reweighting (PSR) module, together with a confidence-aware loss (CAL), is used to mine useful knowledge and alleviate the impact of errors. Experimental results on a variety of benchmarks show that our PMD achieves state-of-the-art performance. To demonstrate the generalizability of our approach, we extend PMD to incorporate stronger supervision for better accuracy: the probabilistic teacher is trained with stronger key-point supervision. Again, we observe the superiority of our PMD. Extensive experiments verify that PMD infers more reliable supervision signals from the probabilistic teacher for representation learning and largely alleviates the influence of errors in pseudo labels. Code is available at https://github.com/fanyang587/PMD.
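As a rough illustration of the confidence-aware reweighting idea described in the abstract, the sketch below shows one plausible form of a confidence-weighted pseudo-label loss, in which the probabilistic teacher's per-pixel uncertainty down-weights unreliable supervision for the student. The function name, tensor shapes, and the exponential confidence mapping are illustrative assumptions, not the paper's actual PSR/CAL implementation.

import torch

def confidence_weighted_loss(student_flow, teacher_mean, teacher_logvar):
    # student_flow, teacher_mean: (B, 2, H, W) dense correspondence fields
    # teacher_logvar: (B, 1, H, W) per-pixel log-variance from the probabilistic teacher
    # Per-pixel endpoint error between the student prediction and the teacher pseudo-label
    epe = torch.norm(student_flow - teacher_mean, dim=1, keepdim=True)
    # Map teacher uncertainty to a confidence weight in (0, 1]; uncertain pixels contribute less
    confidence = torch.exp(-teacher_logvar)
    return (confidence * epe).mean()

In a training loop, teacher_mean and teacher_logvar would come from the teacher's forward pass on an unlabeled real image pair, and the loss would be back-propagated through the student only.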
Year
2021
DOI
10.1109/CVPR46437.2021.00742
Venue
2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021
DocType
Conference
ISSN
1063-6919
Citations
0
PageRank
0.34
References
0
Authors
6
Name            Order  Citations  PageRank
Xin Li          1      12         3.93
Deng-Ping Fan   2      0          0.34
Fan Yang        3      3          3.08
Ao Luo          4      0          1.35
Hong Cheng      5      703        65.27
Zicheng Liu     6      3662       199.64