Title
Exploitation of Gaze Data for Photo Region Labeling in an Immersive Environment
Abstract
Metadata describing the content of photos is of high importance for applications like image search and for building training sets for object detection algorithms. In this work, we apply tags to image regions for a more detailed description of the photo semantics. This region labeling requires no additional effort from the user: it is derived solely from eye tracking data recorded while users play a gaze-controlled game. In the game EyeGrab, users classify and rate photos falling down the screen, assigning each photo to a given category under time pressure. The game has been evaluated in a study with 54 subjects. The results show that it is possible to assign the given categories to image regions with a precision of up to 61%. This indicates that region labeling in an immersive environment like EyeGrab performs almost as well as a previous, much more controlled classification experiment.
Year
2014
DOI
10.1007/978-3-319-04114-8_36
Venue
MMM
Field
Object detection, Metadata, Computer vision, Gaze, Computer science, Eye tracking, Artificial intelligence, Immersion (virtual reality), Connected-component labeling, Semantics
DocType
Conference
Citations
1
PageRank
0.35
References
14
Authors
3
Name            Order  Citations  PageRank
Tina Walber     1      16         3.74
Ansgar Scherp   2      673        80.74
Steffen Staab   3      6658       593.89