Title
SnapCheck: Automated Testing for Snap! Programs
Abstract
Programming environments such as Snap!, Scratch, and Processing engage learners by allowing them to create programming artifacts such as apps and games with visual and interactive output. Learning to program in such a media-focused context has been shown to increase retention and success rates. However, assessing these visual, interactive projects requires time-consuming and laborious manual effort, so it is difficult to offer automated or real-time feedback to students as they work. In this paper, we introduce SnapCheck, a dynamic testing framework for Snap! that enables instructors to author test cases with Condition-Action templates. The goal of SnapCheck is to allow instructors or researchers to author property-based test cases that can automatically assess students' interactive programs with high accuracy. Our evaluation of SnapCheck on 162 code snapshots from a Pong game assignment in an introductory programming course shows that our automated testing framework achieves at least 98% accuracy on all rubric items, showing the potential of using SnapCheck for auto-grading and for providing formative feedback to students.
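The Condition-Action templates described in the abstract suggest a property-based style of test: a condition over the running program's state paired with a property that must hold whenever that condition fires. The sketch below in TypeScript illustrates only the general idea; it is not SnapCheck's actual API, and the names (GameState, ConditionActionTest, runTest) and the Pong rubric item are hypothetical.

// Illustrative only: a condition-action style property test, loosely modeled on
// the idea described in the abstract. These names are not SnapCheck's real API.
interface GameState {
  ballX: number;   // ball position
  ballY: number;
  paddleY: number; // paddle position
  score: number;
}

interface ConditionActionTest {
  name: string;
  // "Condition" part: when should the property be checked?
  condition: (prev: GameState, curr: GameState) => boolean;
  // "Action" part: the property that must hold whenever the condition fires.
  action: (prev: GameState, curr: GameState) => boolean;
}

// Hypothetical rubric item for a Pong game: whenever the ball crosses the
// left edge, the score should increase by one.
const scoreOnMiss: ConditionActionTest = {
  name: "score increments when ball passes the paddle",
  condition: (prev, curr) => prev.ballX >= 0 && curr.ballX < 0,
  action: (prev, curr) => curr.score === prev.score + 1,
};

// Replay a recorded trace of game states and check the property at each step.
function runTest(test: ConditionActionTest, trace: GameState[]): boolean {
  for (let i = 1; i < trace.length; i++) {
    if (test.condition(trace[i - 1], trace[i]) && !test.action(trace[i - 1], trace[i])) {
      return false; // property violated on this step
    }
  }
  return true;
}

// Usage: runTest(scoreOnMiss, recordedTrace) returns false as soon as the
// property is violated on any step of the trace.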
Year: 2021
DOI: 10.1145/3430665.3456367
Venue: Innovation and Technology in Computer Science Education
DocType: Conference
Citations: 1
PageRank: 0.35
References: 0
Authors: 5
Name                Order  Citations  PageRank
Wengran Wang        1      2          2.05
Chenhao Zhang       2      1          1.03
Andreas Stahlbauer  3      1          0.35
Gordon Fraser       4      2625       116.22
Thomas W. Price     5      76         19.27