Title
An efficient parallel strategy for matching visual self-similarities in large image databases
Abstract
Due to the high interest in social online systems, a huge and still increasing amount of image data exists on the web. To handle this massive amount of visual information, algorithms often need to be redesigned. In this work, we develop an efficient approach for finding visual similarities between images that runs entirely on the GPU and scales to large image databases. Based on local self-similarity descriptors, the approach finds similarities even across modalities. Given a set of images, a database is created by storing all descriptors in an arrangement suited to parallel GPU-based comparison. A novel voting scheme further takes the spatial layout of the descriptors into account with hardly any overhead. Thousands of images can be searched in only a few seconds. We apply our algorithm to cluster sets of image responses in order to identify the different senses of ambiguous words and to re-tag similar images with missing tags.
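As an illustration of the kind of GPU-parallel descriptor matching and voting described in the abstract, the following CUDA sketch lets one thread compare a single stored descriptor against a query descriptor and cast a vote for the descriptor's source image. The descriptor length, distance threshold, memory layout, and all identifiers are illustrative assumptions; the paper's actual descriptor arrangement and spatial voting scheme are not reproduced here.

// Minimal CUDA sketch (not the authors' implementation): one thread per stored
// descriptor, votes accumulated per database image. All sizes, names and the
// threshold below are assumptions for illustration only.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

#define DESC_DIM 80            // assumed self-similarity descriptor length
#define MATCH_THRESH 0.5f      // assumed squared-distance threshold for a match

// db:       numDesc * DESC_DIM floats, all database descriptors stored contiguously
// imageIds: numDesc ints, index of the image each descriptor belongs to
// query:    one descriptor of the query image
// votes:    one counter per database image
__global__ void matchAndVote(const float *db, const int *imageIds, int numDesc,
                             const float *query, int *votes)
{
    int d = blockIdx.x * blockDim.x + threadIdx.x;
    if (d >= numDesc) return;

    // Squared L2 distance between the query and database descriptor d.
    float dist = 0.0f;
    for (int k = 0; k < DESC_DIM; ++k) {
        float diff = db[d * DESC_DIM + k] - query[k];
        dist += diff * diff;
    }

    // Vote for the image this descriptor came from if it is close enough.
    if (dist < MATCH_THRESH)
        atomicAdd(&votes[imageIds[d]], 1);
}

int main()
{
    const int numDesc = 1024, numImages = 8;
    std::vector<float> hDb(numDesc * DESC_DIM, 0.1f), hQuery(DESC_DIM, 0.1f);
    std::vector<int> hIds(numDesc);
    for (int i = 0; i < numDesc; ++i) hIds[i] = i % numImages;

    float *dDb, *dQuery; int *dIds, *dVotes;
    cudaMalloc(&dDb, hDb.size() * sizeof(float));
    cudaMalloc(&dQuery, hQuery.size() * sizeof(float));
    cudaMalloc(&dIds, hIds.size() * sizeof(int));
    cudaMalloc(&dVotes, numImages * sizeof(int));
    cudaMemcpy(dDb, hDb.data(), hDb.size() * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dQuery, hQuery.data(), hQuery.size() * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dIds, hIds.data(), hIds.size() * sizeof(int), cudaMemcpyHostToDevice);
    cudaMemset(dVotes, 0, numImages * sizeof(int));

    matchAndVote<<<(numDesc + 255) / 256, 256>>>(dDb, dIds, numDesc, dQuery, dVotes);

    std::vector<int> hVotes(numImages);
    cudaMemcpy(hVotes.data(), dVotes, numImages * sizeof(int), cudaMemcpyDeviceToHost);
    for (int i = 0; i < numImages; ++i)
        printf("image %d: %d votes\n", i, hVotes[i]);
    // (error checking and cudaFree omitted for brevity)
    return 0;
}

In this toy setup every thread handles one database descriptor, so the comparison of a query against thousands of images maps directly onto the GPU's parallelism; a real system would batch many query descriptors and use a spatially aware vote as the paper describes.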
Year
2012
DOI
10.1007/978-3-642-33863-2_28
Venue
ECCV Workshops (1)
Keywords
efficient approach, local self-similarity descriptors, visual similarity, large image databases, image data, efficient parallel strategy, image response, re-tag similar image, ambiguous word, visual information, visual self-similarities, massive amount
Field
Modalities, Computer vision, Data mining, Existential quantification, Computer science, Artificial intelligence, Database
DocType
Conference
Volume
7583
ISSN
0302-9743
Citations
0
PageRank
0.34
References
13
Authors
3
Name | Order | Citations | PageRank
Katharina Schwarz | 1 | 3 | 2.74
Tobias Häußler | 2 | 0 | 0.34
Hendrik P. A. Lensch | 3 | 1471 | 96.59