Title
Find you wherever you are: geographic location and environment context-based pedestrian detection
Abstract
Most existing approaches to pedestrian detection rely mainly on visual appearance in real-world images. However, visual information alone cannot always provide reliable guidance, since pedestrians change pose and wear different clothes under different conditions. In this work, by leveraging a vast amount of Web images, we first construct a contextual image database in which each image is automatically tagged with geographic location (i.e., latitude and longitude) and environment information (i.e., season, time and weather condition), with the aid of image metadata and a few pre-trained classifiers. To support pedestrian detection, we present an annotation scheme that sharply reduces manual labeling effort. We study several properties of the contextual image database, including whether it is authentic and helpful for pedestrian detection. Moreover, we propose a context-based pedestrian detection approach that jointly explores visual and contextual cues in a probabilistic model. Encouraging results are reported on our contextual image database.
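The abstract describes fusing a visual detector's output with geographic and environment context in a probabilistic model. The sketch below is only an illustration of that general idea, not the paper's actual model: the attribute names and prior values are invented assumptions, and the fusion is a simple naive-Bayes-style odds combination.

```python
# Illustrative sketch only: attribute names and prior values are assumed,
# not taken from the paper. Each context cue contributes an assumed prior
# P(pedestrian present | attribute value); cues are combined with the
# visual score multiplicatively (naive Bayes style) and renormalized.

CONTEXT_PRIORS = {
    ("time", "day"): 0.6,
    ("time", "night"): 0.3,
    ("weather", "clear"): 0.6,
    ("weather", "rainy"): 0.4,
    ("season", "summer"): 0.55,
    ("season", "winter"): 0.45,
}

def fuse(visual_score, context, priors=CONTEXT_PRIORS):
    """Fuse a visual detection score in (0, 1) with context cues.

    Treats each cue as an independent expert: multiply the evidence for
    'pedestrian' and 'no pedestrian' across cues, then renormalize so the
    result is a posterior probability.
    """
    p_yes = visual_score
    p_no = 1.0 - visual_score
    for key, value in context.items():
        p = priors.get((key, value), 0.5)  # 0.5 = uninformative cue
        p_yes *= p
        p_no *= 1.0 - p
    return p_yes / (p_yes + p_no)

# A rainy night-time context pulls the posterior below the raw visual score.
posterior = fuse(0.7, {"time": "night", "weather": "rainy"})
```

Under this toy fusion, an uninformative context leaves the visual score unchanged, while unfavorable cues (night, rain) lower the posterior and favorable ones raise it.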
Year: 2012
DOI: 10.1145/2390790.2390801
Venue: GeoMM@ACM Multimedia
Keywords: contextual cue, visual appearance, web image, contextual image database, image metadata, pedestrian detection, context-based pedestrian detection approach, real world image, different clothes, visual information, geographic location, Bayesian network
Field: Computer vision, Metadata, Annotation, Object-class detection, Computer science, Geographic coordinate system, Bayesian network, Artificial intelligence, Statistical model, Contextual image classification, Pedestrian detection
DocType: Conference
Citations: 0
PageRank: 0.34
References: 20
Authors: 4
Name            Order  Citations  PageRank
Yuan Liu        1      215        11.43
Zhongchao Shi   2      24         5.65
Gang Wang       3      2826       5.93
Haike Guan      4      0          0.68