Title
Semantic keyword-based retrieval of photos taken with mobile devices
Abstract
This paper presents an approach for incorporating contextual metadata into a keyword-based photo retrieval process. We use our mobile annotation system PhotoMap to create metadata describing the photo shoot context (e.g., street address, nearby objects, season, lighting, nearby people). These metadata are then used to generate a set of stamped words for indexing each photo. We adapt the Vector Space Model (VSM) to transform these shoot context words into document-vector terms. Furthermore, spatial reasoning is used to infer new potential indexing terms. We define methods for weighting these terms and for handling query matching. We also detail retrieval experiments carried out using PhotoMap and Flickr geotagged photos, and illustrate the advantages of using Wikipedia georeferenced objects for indexing photos.
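The abstract describes turning shoot-context keywords into weighted document-vector terms and matching them against keyword queries. The paper's actual weighting and matching scheme is not given here; the following is a minimal illustrative sketch in Python, assuming standard TF-IDF weights and cosine similarity, with invented photo identifiers and context keywords.

```python
# Minimal sketch (not the paper's implementation): context-derived keywords
# (street address, season, lighting, nearby objects...) are treated as VSM
# document terms, weighted with TF-IDF, and ranked against a keyword query
# by cosine similarity. All photo ids and keywords below are hypothetical.
import math
from collections import Counter

# Hypothetical shoot-context keywords generated for each photo.
photos = {
    "photo_1": ["grenoble", "rue felix poulat", "winter", "daylight", "eglise saint-louis"],
    "photo_2": ["grenoble", "bastille", "summer", "night", "cable car"],
    "photo_3": ["paris", "tour eiffel", "summer", "daylight", "champ de mars"],
}

def tf_idf_vectors(docs):
    """Build one TF-IDF weighted term vector per photo."""
    n_docs = len(docs)
    df = Counter(term for terms in docs.values() for term in set(terms))
    vectors = {}
    for doc_id, terms in docs.items():
        tf = Counter(terms)
        vectors[doc_id] = {
            term: (count / len(terms)) * math.log(n_docs / df[term])
            for term, count in tf.items()
        }
    return vectors

def cosine(query_vec, doc_vec):
    """Cosine similarity between two sparse term-weight dictionaries."""
    dot = sum(w * doc_vec.get(t, 0.0) for t, w in query_vec.items())
    norm_q = math.sqrt(sum(w * w for w in query_vec.values()))
    norm_d = math.sqrt(sum(w * w for w in doc_vec.values()))
    return dot / (norm_q * norm_d) if norm_q and norm_d else 0.0

index = tf_idf_vectors(photos)

# Keyword query: every query term gets a uniform weight of 1.0 in this sketch.
query = {"grenoble": 1.0, "winter": 1.0}
ranking = sorted(((cosine(query, vec), pid) for pid, vec in index.items()), reverse=True)
for score, pid in ranking:
    print(f"{pid}: {score:.3f}")
```

In this sketch photo_1 ranks first because it matches both query terms; the paper additionally infers extra indexing terms through spatial reasoning (e.g., from Wikipedia georeferenced objects), which would simply add terms to each photo's keyword list before weighting.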
Year
2008
DOI
10.1145/1497185.1497226
Venue
MoMM
Keywords
new potential indexing term, detail retrieval experiment, nearby people, indexing photo, flickr geotagged photo, mobile device, contextual metadata, semantic keyword-based retrieval, keyword-based photo retrieval process, mobile annotation system photomap, photo shoot context, nearby object, seasonality, indexation, indexing terms, vector space model, image retrieval, spatial reasoning
Field
Metadata, Spatial intelligence, Weighting, Annotation, Information retrieval, Computer science, Search engine indexing, Context awareness, Mobile device, Vector space model
DocType
Conference
Citations
4
PageRank
0.45
References
17
Authors
6
Name | Order | Citations | PageRank
Windson Viana | 1 | 201 | 28.40
Samira Hammiche | 2 | 34 | 3.94
Bogdan Moisuc | 3 | 29 | 3.92
Marlène Villanova-Oliver | 4 | 211 | 30.47
Jérôme Gensel | 5 | 380 | 50.95
Hervé Martin | 6 | 204 | 22.03