Title: Perturbed Model Validation: A New Framework to Validate Model Relevance
Abstract: This paper introduces PMV (Perturbed Model Validation), a new technique to validate model relevance and detect overfitting or underfitting. PMV operates by injecting noise into the training data, re-training the model on the perturbed data, and then using the rate at which training accuracy decreases to assess model relevance. A larger decrease rate indicates a better concept-hypothesis fit. We realise PMV by using label flipping to inject noise, and evaluate it on four real-world datasets (breast cancer, adult, connect-4, and MNIST) and three synthetic datasets in the binary classification setting. The results reveal that PMV selects models more precisely and more stably than cross-validation, and effectively detects both overfitting and underfitting.
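The procedure the abstract describes can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: the nearest-centroid classifier, the choice of noise ratios, and the linear fit used to estimate the accuracy-decrease rate are all assumptions made so the example is self-contained.

```python
import numpy as np

def flip_labels(y, ratio, rng):
    """Flip a `ratio` fraction of binary labels (label-flipping noise injection)."""
    y = y.copy()
    n_flip = int(ratio * len(y))
    idx = rng.choice(len(y), size=n_flip, replace=False)
    y[idx] = 1 - y[idx]
    return y

class NearestCentroid:
    """A deliberately simple stand-in classifier; PMV itself is model-agnostic."""
    def fit(self, X, y):
        self.c0 = X[y == 0].mean(axis=0)
        self.c1 = X[y == 1].mean(axis=0)
        return self
    def predict(self, X):
        d0 = ((X - self.c0) ** 2).sum(axis=1)
        d1 = ((X - self.c1) ** 2).sum(axis=1)
        return (d1 < d0).astype(int)

def pmv_decrease_rate(model, X, y, ratios=(0.0, 0.1, 0.2, 0.3), seed=0):
    """Retrain on increasingly noisy labels and estimate how fast
    training accuracy falls; a larger rate suggests a better
    concept-hypothesis fit (the model is learning signal, not noise)."""
    rng = np.random.default_rng(seed)
    accs = []
    for r in ratios:
        y_noisy = flip_labels(y, r, rng)
        model.fit(X, y_noisy)
        accs.append((model.predict(X) == y_noisy).mean())
    # Slope of training accuracy vs. noise ratio (assumed linear fit).
    slope = np.polyfit(ratios, accs, 1)[0]
    return -slope
```

On well-separated data, a relevant model keeps predicting the true concept, so its accuracy against the noisy labels falls roughly one-for-one with the noise ratio, yielding a large decrease rate; an overfitted model instead memorises the flipped labels and its training accuracy barely drops.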
Year: 2019
Venue: arXiv: Learning
DocType: Journal
Volume: abs/1905.10201
Citations: 0
PageRank: 0.34
References: 0
Authors: 5
Name               Order  Citations  PageRank
Jie M. Zhang       1      0          0.34
Earl T. Barr       2      468        15.46
Benjamin Guedj     3      9          8.82
Mark Harman        4      10264      389.82
John Shawe-Taylor  5      118791     518.73