Title
Click-through-based Subspace Learning for Image Search
Abstract
One of the fundamental problems in image search is to rank image documents according to a given textual query. In this paper we address two limitations of existing image search engines. First, there is no straightforward way to compare textual keywords with visual image content. Image search engines therefore rely heavily on surrounding texts, which are often noisy or too sparse to accurately describe the image content. Second, ranking functions are trained on query-image pairs labeled by human annotators, making annotation expensive and hard to scale up. We demonstrate that these two fundamental challenges can be mitigated by jointly exploiting subspace learning and click-through data. The former creates a latent subspace in which information from the originally incomparable views (i.e., textual and visual) can be directly compared, while the latter exploits the abundant and freely accessible click-through data (i.e., "crowdsourced" human intelligence) for query understanding. Specifically, we investigate a series of click-through-based subspace learning techniques (CSL) for image search. We conduct experiments on the MSR-Bing Grand Challenge, and the final evaluation performance achieves DCG@25=0.47225. Moreover, the feature dimension is reduced by several orders of magnitude (e.g., from thousands to tens).
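The latent-subspace idea in the abstract can be illustrated with a minimal sketch. This is not the paper's CSL formulation: the projection matrices `W_text` and `W_img` below are hypothetical, hand-picked stand-ins for learned mappings that place both views in a shared space where cosine similarity ranks images for a query.

```python
import numpy as np

def project(X, W):
    """Map features X (n x d) into the k-dim latent subspace via W (d x k)."""
    Z = X @ W
    # L2-normalize rows so dot products become cosine similarities.
    return Z / np.linalg.norm(Z, axis=1, keepdims=True)

def rank_images(query_vec, image_feats, W_text, W_img):
    """Score images for one textual query in the shared subspace."""
    q = project(query_vec[None, :], W_text)   # 1 x k query embedding
    V = project(image_feats, W_img)           # n x k image embeddings
    scores = (V @ q.T).ravel()                # cosine similarity per image
    order = np.argsort(-scores)               # best-first image indices
    return order, scores

# Toy example: 3-dim text view, 4-dim visual view, 2-dim latent space.
# All matrices below are illustrative, not learned from click-through data.
W_text = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
W_img  = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5], [0.0, 0.0]])
query  = np.array([1.0, 0.0, 0.0])
images = np.array([[1.0, 0.0, 0.0, 0.0],   # aligned with the query
                   [0.0, 1.0, 0.0, 0.0],   # orthogonal to it
                   [1.0, 1.0, 0.0, 0.0]])  # in between
order, scores = rank_images(query, images, W_text, W_img)
print(order.tolist())  # -> [0, 2, 1]: the aligned image ranks first
```

In the paper's setting the projections would be learned from click-through pairs, and the same machinery also yields the drastic dimension reduction the abstract mentions (ranking happens in the small latent space, not the original feature spaces).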
Year
2014
DOI
10.1145/2647868.2656404
Venue
ACM Multimedia 2014
Keywords
subspace learning, image search, click-through data, dnn image representation, retrieval models
Field
Computer vision, Click-through rate, Automatic image annotation, Annotation, Search engine, Ranking, Subspace topology, Feature detection (computer vision), Computer science, Artificial intelligence, Feature Dimension
DocType
Conference
Citations
12
PageRank
0.54
References
9
Authors
5
Name          Order  Citations  PageRank
Yingwei Pan   1      357        23.66
Ting Yao      2      842        52.62
Xinmei Tian   3      487        38.43
Houqiang Li   4      2090       172.30
C. W. Ngo     5      4271       211.46