Title
Automatic Grading for Program Tracing Exercises.
Abstract
Understanding how a programming construct executes is a prerequisite for coding. Tracing exercises are often used to help students develop accurate mental models of how different constructs execute. Besides asking students for the final result, a full tracing exercise, much like an instructor's whiteboard demo, checks whether students understand the execution flow and how memory/output changes at each step of execution. Such exercises can be time-consuming to grade or set up. A paper-based full tracing exercise, once submitted, shows only the final state of memory/output, so it is time-consuming to infer whether students traced each step correctly. In an auto-gradable quiz, questions that are too specific (e.g., the value of a particular variable) partially reveal what the program will do, and it takes time to set up questions that avoid this problem. We are developing a web-based system that allows instructors to set up auto-gradable full tracing exercises easily. Our approach is to augment pythontutor.com, a popular open-source code visualization system that graphically demonstrates the execution flow and memory/output changes of a complete program. Before demonstrating a step, the new system will prompt the student to determine which line executes next and what changes happen in memory/output. The system will be auto-gradable, making it easy to deliver full tracing exercises. We will demonstrate implemented features, discuss future plans, and gather feedback from those present.
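The abstract describes prompting students, before each visualized step, for the line that executes next and for the resulting memory/output changes, and then grading those answers automatically. As a rough, hypothetical sketch only, not the authors' pythontutor.com-based implementation, the Python snippet below shows one way a ground-truth trace could be captured with sys.settrace and compared against a student's per-step prediction; record_trace, grade_step, and the prediction format are names invented here for illustration.

# Hypothetical sketch: capture a ground-truth trace of a small program and
# check a student's prediction for one step against it. Function names and
# the prediction format are invented for illustration.
import sys
from copy import deepcopy

def record_trace(source):
    """Run `source` and record (next line to execute, variable snapshot) per step."""
    steps = []

    def tracer(frame, event, arg):
        # A "line" event fires just before a line executes, so each snapshot
        # pairs the upcoming line number with the variables visible at that point.
        if event == "line" and frame.f_code.co_filename == "<exercise>":
            snapshot = {k: deepcopy(v) for k, v in frame.f_locals.items()
                        if not k.startswith("__")}
            steps.append((frame.f_lineno, snapshot))
        return tracer

    code = compile(source, "<exercise>", "exec")
    sys.settrace(tracer)
    try:
        exec(code, {})
    finally:
        sys.settrace(None)
    return steps

def grade_step(steps, step_index, predicted_line, predicted_vars):
    """Return True if the student predicted the next line and variable state correctly."""
    actual_line, actual_vars = steps[step_index]
    return predicted_line == actual_line and predicted_vars == actual_vars

if __name__ == "__main__":
    program = "x = 1\ny = x + 2\nx = y * 2\n"
    steps = record_trace(program)
    # After line 1 runs, the student predicts that line 2 executes next with x == 1.
    print(grade_step(steps, 1, predicted_line=2, predicted_vars={"x": 1}))  # True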
Year
2020
DOI
10.1145/3328778.3372561
Venue
SIGCSE
DocType
Conference
ISBN
978-1-4503-6793-6
Citations
0
PageRank
0.34
References
0
Authors
1
  Name
  Wei Jin
  Order
  1
  Citations
  83
  PageRank
  25.25