Title
Fast discrete cross-modal hashing with semantic consistency.
Abstract
Supervised cross-modal hashing has attracted widespread attention for large-scale retrieval tasks due to its promising retrieval performance. However, most existing works suffer from some of the following issues. First, most of them leverage only the pair-wise similarity matrix to learn hash codes, which may result in a loss of class information. Second, the pair-wise similarity matrix generally leads to high computational complexity and memory cost. Third, most of them relax the discrete constraints during optimization, which generally results in a large cumulative quantization error and consequently inferior hash codes. To address the above problems, we present a Fast Discrete Cross-modal Hashing method in this paper, FDCH for short. Specifically, it first leverages both class labels and the pair-wise similarity matrix to learn a shared Hamming space in which semantic consistency can be better preserved. Then we propose an asymmetric hash-code learning model to avoid the challenging problem of symmetric matrix factorization. Finally, an effective and efficient discrete optimization scheme is designed to generate discrete hash codes directly, and the computational complexity and memory cost caused by the pair-wise similarity matrix are reduced from O(n^2) to O(n), where n denotes the size of the training set. Extensive experiments conducted on three real-world datasets highlight the superiority of FDCH compared with several cross-modal hashing methods and demonstrate its effectiveness and efficiency.
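The O(n^2)-to-O(n) reduction mentioned in the abstract is typically achieved by exploiting the low-rank structure of a label-induced similarity matrix. Below is a minimal NumPy sketch of this idea; the symbols S (pairwise similarity), L (binary label matrix), V (a real-valued code matrix) and the definition S = 2LL^T - 1 are common conventions in supervised hashing literature, not necessarily FDCH's exact formulation. It shows that a product S V can be computed without ever materializing the n x n similarity matrix, by regrouping the matrix multiplications:

```python
import numpy as np

rng = np.random.default_rng(0)
n, c, k = 6, 3, 4          # samples, classes, hash-code length (toy sizes)

L = (rng.random((n, c)) > 0.5).astype(float)   # binary label matrix, n x c
V = rng.standard_normal((n, k))                # e.g. a real-valued code matrix, n x k

# Naive route: materialize the n x n pairwise similarity S = 2 L L^T - 1.
# This costs O(n^2) memory and time.
S = 2 * L @ L.T - np.ones((n, n))
naive = S @ V

# Factorized route: never form S. Regroup as 2 L (L^T V) - 1_n (1_n^T V),
# where every intermediate has at most n rows and c or k columns, i.e. O(n).
ones_n = np.ones((n, 1))
fast = 2 * L @ (L.T @ V) - ones_n @ (ones_n.T @ V)

assert np.allclose(naive, fast)
```

Because c and k are small constants, the factorized route scales linearly in n, which is the kind of reordering that makes training on large datasets feasible.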
Year
2020
DOI
10.1016/j.neunet.2020.01.035
Venue
Neural Networks
Keywords
Cross-modal retrieval, Semantic consistency, Discrete optimization, Hashing
DocType
Journal
Volume
125
Issue
1
ISSN
0893-6080
Citations
0
PageRank
0.34
References
28
Authors
7
Name          Order  Citations  PageRank
Tao Yao       1      39         5.33
LianShan Yan  2      641        4.51
Yilan Ma      3      0          0.34
Hong Yu       4      0          0.34
Qingtang Su   5      1761       6.90
Gang Wang     6      3449       7.03
Qi Tian       7      0          0.34