Title
ModelTracker: Redesigning Performance Analysis Tools for Machine Learning
Abstract
Model building in machine learning is an iterative process. The performance analysis and debugging step typically involves a disruptive cognitive switch from model building to error analysis, discouraging an informed approach to model building. We present ModelTracker, an interactive visualization that subsumes the information contained in numerous traditional summary statistics and graphs while displaying example-level performance and enabling direct error examination and debugging. Usage analysis from machine learning practitioners building real models with ModelTracker over six months shows that it is used often and throughout the model-building process. A controlled experiment focusing on ModelTracker's debugging capabilities shows that participants prefer it over traditional tools, with no loss in model performance.
Year
2015
DOI
10.1145/2702123.2702509
Venue
CHI
Keywords
user interfaces, interactive visualization, debugging, machine learning, performance analysis
Field
Analysis tools, Graph, Usage analysis, Iterative and incremental development, Computer science, Model building, Interactive visualization, Human–computer interaction, Artificial intelligence, Machine learning, Debugging, Algorithmic program debugging
DocType
Conference
Citations
57
PageRank
2.10
References
15
Authors
6
Name | Order | Citations | PageRank
Saleema Amershi | 1 | 775 | 45.16
Max Chickering | 2 | 57 | 2.78
Steven M. Drucker | 3 | 2399 | 286.15
Bongshin Lee | 4 | 2738 | 143.95
Patrice Y. Simard | 5 | 1112 | 155.00
Jina Suh | 6 | 178 | 10.04