Title
Generating annotations for how-to videos using crowdsourcing
Abstract
How-to videos can be valuable for learning, but searching for and following along with them can be difficult. Labeling key events in how-to videos, such as the tools used, could improve video indexing, searching, and browsing. We introduce a crowdsourcing annotation tool for Photoshop how-to videos with a three-stage method: (1) gathering timestamps of important events, (2) labeling each event, and (3) capturing how each event affects the task of the tutorial. Our ultimate goal is to generalize this method to other domains of how-to videos. We evaluate the annotation tool with Amazon Mechanical Turk workers to investigate the accuracy, cost, and feasibility of the three-stage method for annotating large numbers of video tutorials. Stages 1 and 3 leave room for improvement, but stage 2 produces accurate labels over 90% of the time when worker responses are aggregated by majority voting. We also observed that changes to the instructions and interface of each task can significantly improve the accuracy of the results.
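The abstract states that stage-2 labels become accurate when worker responses are aggregated by majority voting. As a rough illustration only, and not the authors' implementation, the Python sketch below shows one way per-event majority voting over crowd labels could be computed; the event timestamps, tool names, and worker counts are hypothetical.

```python
from collections import Counter

def majority_vote(labels_per_event):
    """Aggregate crowd labels for each event by majority vote.

    labels_per_event: dict mapping an event id (e.g., a timestamp) to the
    list of labels submitted by different workers. Returns, per event, the
    winning label and the fraction of workers who agreed with it.
    """
    results = {}
    for event_id, labels in labels_per_event.items():
        label, votes = Counter(labels).most_common(1)[0]
        results[event_id] = (label, votes / len(labels))
    return results

# Hypothetical stage-2 responses: three workers label the tool used at each
# timestamped event in a Photoshop tutorial.
worker_labels = {
    "00:42": ["Lasso Tool", "Lasso Tool", "Magic Wand"],
    "01:15": ["Clone Stamp", "Clone Stamp", "Clone Stamp"],
}
print(majority_vote(worker_labels))
# {'00:42': ('Lasso Tool', 0.666...), '01:15': ('Clone Stamp', 1.0)}
```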
Year: 2013
DOI: 10.1145/2468356.2468506
Venue: CHI Extended Abstracts
Keywords: accurate label, amazon mechanical turk worker, photoshop how-to video, generating annotation, important event, how-to video, video indexing, crowdsourcing annotation tool, video tutorial, three-stage method, annotation tool
Field: Annotation, Crowdsourcing, Computer science, Search engine indexing, Human–computer interaction, Timestamp, Majority rule, Multimedia
DocType: Conference
Citations: 5
PageRank: 0.49
References: 4
Authors: 3
Name: Phu Nguyen, Order: 1, Citations: 7, PageRank: 0.91
Name: Juho Kim, Order: 2, Citations: 632, PageRank: 68.72
Name: Robert C. Miller, Order: 3, Citations: 4412, PageRank: 326.00