Abstract |
---|
The analysis of eye tracking data often requires the annotation of areas of interest (AOIs) to derive semantic interpretations of human viewing behavior during experiments. This annotation is typically the most time-consuming step of the analysis process. Especially for data from wearable eye tracking glasses, every independently recorded video has to be annotated individually and corresponding AOIs between videos have to be identified. We provide a novel visual analytics approach to ease this annotation process by image-based, automatic clustering of eye tracking data integrated in an interactive labeling and analysis system. The annotation and analysis are tightly coupled by multiple linked views that allow for a direct interpretation of the labeled data in the context of the recorded video stimuli. The components of our analytics environment were developed with a user-centered design approach in close cooperation with an eye tracking expert. We demonstrate our approach with eye tracking data from a real experiment and compare it to an analysis of the data by manual annotation of dynamic AOIs. Furthermore, we conducted an expert user study with 6 external eye tracking researchers to collect feedback and identify analysis strategies they used while working with our application. |
Year | DOI | Venue |
---|---|---|
2017 | 10.1109/TVCG.2016.2598695 | IEEE Trans. Vis. Comput. Graph. |
Keywords | Field | DocType
---|---|---|
Gaze tracking, Videos, Mobile communication, Labeling, Data visualization, Visual analytics | Computer vision, Data visualization, Annotation, Wearable computer, Computer science, Visual analytics, Eye tracking, Video tracking, Artificial intelligence, Cluster analysis, Analytics | Journal
Volume | Issue | ISSN
---|---|---|
23 | 1 | 1077-2626
Citations | PageRank | References
---|---|---|
8 | 0.45 | 23
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Kuno Kurzhals | 1 | 227 | 20.63 |
Marcel Hlawatsch | 2 | 128 | 9.80 |
Christof Seeger | 3 | 8 | 0.45 |
Daniel Weiskopf | 4 | 2988 | 204.30 |