Title |
---|
Tagging-by-search: automatic image region labeling using gaze information obtained from image search |
Abstract |
---|
Labeled image regions provide valuable information that can be used in different settings such as image search. The manual creation of region labels is a tedious task, while fully automatic approaches fail to understand image content sufficiently due to the huge variety of depicted objects. Our approach benefits from the expected spread of eye-tracking hardware and uses gaze information obtained from users performing image search tasks to automatically label image regions. This makes it possible to exploit human capabilities of visual perception while users perform daily routine tasks. In an experiment with 23 participants, we show that it is possible to assign search terms to photo regions by means of gaze analysis with an average precision of 0.56 and an average F-measure of 0.38 over 361 photos. The participants performed different search tasks while their gaze was recorded. The results of the experiment show that the gaze-based approach performs significantly better than a baseline approach based on saliency maps. |
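As a reading aid for the reported numbers (not part of the paper's method): the F-measure is the harmonic mean of precision and recall. A minimal sketch, assuming the standard F1 definition; the implied recall of roughly 0.29 is only illustrative, since averages of per-image scores need not satisfy this identity exactly.

```python
# F1-measure: harmonic mean of precision and recall.
def f_measure(precision: float, recall: float) -> float:
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# With the paper's reported average precision of 0.56, a recall of
# about 0.29 would yield the reported F-measure of 0.38. This is an
# illustration only; the paper reports averages over 361 photos.
print(round(f_measure(0.56, 0.29), 2))  # → 0.38
```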
Year | DOI | Venue |
---|---|---
2014 | 10.1145/2557500.2557517 | IUI |
Keywords | Field | DocType
---|---|---
automatic image region, different search task, approach benefit, image region, baseline approach, image search task, image search, image content, search term, automatic approach, labeled image region, region labeling, eye tracking | Computer vision, Gaze, Salience (neuroscience), Computer science, Image content, Exploit, Eye tracking, Artificial intelligence, Connected-component labeling, Multimedia, Visual perception | Conference
Citations | PageRank | References
---|---|---
0 | 0.34 | 26
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---
Tina Walber | 1 | 16 | 3.74 |
Chantal Neuhaus | 2 | 1 | 1.04 |
Ansgar Scherp | 3 | 673 | 80.74 |