Title
Automatically identifying targets users interact with during real world tasks
Abstract
Information about the location and size of the targets that users interact with in real-world settings can enable new innovations in human performance assessment and software usability analysis. Accessibility APIs provide some information about the size and location of targets. However, this information is incomplete because it does not support all targets found in modern interfaces, and the reported sizes can be inaccurate. These accessibility APIs access the size and location of targets through low-level hooks into the operating system or an application. We have developed an alternative solution for target identification that leverages visual affordances in the interface and the visual cues produced as users interact with targets. We have used our novel target identification technique in a hybrid solution that combines machine learning, computer vision, and accessibility API data to find the size and location of targets users select with 89% accuracy. Our hybrid approach is superior to the accessibility API alone: in our dataset of 1,355 targets covering 8 popular applications, only 74% of the targets were correctly identified by the API alone.
Year
DOI
Venue
2010
10.1145/1719970.1719973
IUI
Keywords
Field
DocType
accessibility api,targets users interact,accessibility apis access,accessibility apis,reported size,hybrid approach,targets user,alternative solution,human performance assessment,users interact,accessibility api data,real world task,operating system,human performance,machine learning,visual cues
Sensory cue,Computer science,Usability,Human–computer interaction,Multimedia,Affordance
Conference
Citations 
PageRank 
References 
34
1.30
15
Authors
3
Name | Order | Citations | PageRank
Amy Hurst | 1 | 2602 | 3.69
Scott Hudson | 2 | 65649 | 10.06
Jennifer Mankoff | 3 | 27272 | 30.05