Title
An efficient framework for location-based scene matching in image databases.
Abstract
SIFT-based methods have been widely used for scene matching of photos taken at particular locations or places of interest. These methods are typically very time consuming due to the large number and high dimensionality of the features used, making them infeasible for large consumer image collections, where computational power is limited and a fast response is desired. Considerable computational savings can be realized if images containing signature elements of particular locations can be automatically identified from the collection, so that only these representative images are used for scene matching. We propose an efficient framework incorporating a set of discriminative image features that enables us to select representative images for fast location-based scene matching. These image features are used to classify images as good or bad candidates for scene matching, using different classification approaches. Furthermore, the image features produced by our framework facilitate the use of sub-images for location-based scene matching with SIFT features. The experimental results demonstrate the effectiveness of our approach compared with the traditional SIFT-, PCA-SIFT-, and SURF-based approaches, reducing computation time by an order of magnitude.
Year
2012
DOI
10.1007/s13735-012-0011-7
Venue
IJMIR
Keywords
Scene matching, SIFT, Clustering, Image search and retrieval, Face detection, Occlusion, Blur, Classification
Field
Scale-invariant feature transform, Computer vision, Pattern recognition, Feature (computer vision), Computer science, Scene matching, Curse of dimensionality, Artificial intelligence, Face detection, Cluster analysis, Discriminative model, Machine learning
DocType
Journal
Volume
1
Issue
2
ISSN
2192-662X
Citations
1
PageRank
0.40
References
15
Authors
3
Name	Order	Citations	PageRank
Xu Chen	1	334	19.54
Madirakshi Das	2	131	25.69
Alexander C. Loui	3	773	58.76