Title
Deep Unsupervised Hybrid-similarity Hadamard Hashing
Abstract
Hashing has become increasingly important for large-scale image retrieval. Recently, deep supervised hashing has shown promising performance, yet little work has been done under the more realistic unsupervised setting. The most challenging problem for unsupervised hashing methods is the lack of supervisory information. In addition, existing methods fail to distinguish image pairs with different degrees of similarity, which leads to a suboptimal construction of the similarity matrix. In this paper, we propose a simple yet effective unsupervised hashing method, dubbed Deep Unsupervised Hybrid-similarity Hadamard Hashing (DU3H), which tackles these issues in an end-to-end deep hashing framework. DU3H employs orthogonal Hadamard codes to provide auxiliary supervisory information in the unsupervised setting, which maximally satisfies the independence and balance properties of hash codes. Moreover, DU3H utilizes both highly and normally confident image pairs to jointly construct a hybrid-similarity matrix, which magnifies the impacts of different pairs to better preserve the semantic relations between images. Extensive experiments conducted on three widely used benchmarks validate the superiority of DU3H.
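The abstract only sketches the two components at a high level. As a rough illustration (not the authors' released code), the Python snippet below shows how rows of a Hadamard matrix can serve as independent, balanced auxiliary target codes, and how a hybrid-similarity matrix might be built from pairwise cosine similarities using two confidence thresholds. All function names and the parameters t_high, t_low, and w_high are hypothetical choices for this sketch, not values taken from the paper.

# Illustrative sketch only: Hadamard target codes + a hybrid-similarity matrix.
# Thresholds and weights are hypothetical, not the paper's actual settings.
import numpy as np
from scipy.linalg import hadamard

def hadamard_targets(num_targets: int, code_length: int) -> np.ndarray:
    """Return `num_targets` rows of a Hadamard matrix as {-1, +1} target codes.

    Rows of a Hadamard matrix are mutually orthogonal, and every row except the
    first is balanced (equal numbers of +1 and -1), which is why such codes are
    a natural source of independent, balanced hash targets.
    """
    assert code_length & (code_length - 1) == 0, "code length must be a power of 2"
    H = hadamard(code_length)                  # (code_length, code_length), entries are +/-1
    assert num_targets <= code_length, "not enough orthogonal rows available"
    return H[:num_targets]

def hybrid_similarity(features: np.ndarray,
                      t_high: float = 0.9,
                      t_low: float = 0.6,
                      w_high: float = 2.0) -> np.ndarray:
    """Build a pairwise similarity matrix from deep features.

    Pairs whose cosine similarity exceeds `t_high` are treated as highly
    confident positives (weight `w_high`), pairs above `t_low` as normally
    confident positives (weight 1), and all remaining pairs as dissimilar (-1).
    """
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    cos = f @ f.T                              # cosine similarities, shape (N, N)
    S = -np.ones_like(cos)
    S[cos > t_low] = 1.0
    S[cos > t_high] = w_high
    np.fill_diagonal(S, w_high)                # every image is similar to itself
    return S

# Tiny usage example with random data.
targets = hadamard_targets(num_targets=10, code_length=64)
feats = np.random.randn(8, 128)
S = hybrid_similarity(feats)
print(targets.shape, S.shape)                  # (10, 64) (8, 8)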
Year
2020
DOI
10.1145/3394171.3414028
Venue
MM '20: The 28th ACM International Conference on Multimedia, Seattle, WA, USA, October 2020
DocType
Conference
ISBN
978-1-4503-7988-5
Citations
1
PageRank
0.35
References
0
Authors
6
Name            Order  Citations  PageRank
Wanqian Zhang   1      5          4.11
Dayan Wu        2      15         7.33
Yu Zhou         3      98         22.73
Bo Li           4      26         10.93
Weiping Wang    5      7          9.20
Dan Meng        6      37         16.11