Title
Uncertainty-Driven Black-Box Test Data Generation
Abstract
We can never be certain that a software system is correct simply by testing it, but with every additional successful test we become less uncertain about its correctness. In the absence of source code or elaborate specifications and models, tests are usually generated or chosen randomly. However, rather than choosing tests at random, it would be preferable to choose those tests that decrease our uncertainty about correctness the most. In order to guide test generation, we apply what is referred to in Machine Learning as the "Query Strategy Framework": we infer a behavioural model of the system under test and select those tests which the inferred model is "least certain" about. Running these tests on the system under test thus directly targets those parts about which tests so far have failed to inform the model. We provide an implementation that uses a genetic programming engine for model inference in order to enable an uncertainty sampling technique known as "query by committee", and evaluate it on eight subject systems from the Apache Commons Math framework and JodaTime. The results indicate that test generation using uncertainty sampling outperforms conventional and Adaptive Random Testing.
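The abstract's core loop (infer a committee of behavioural models, pick the candidate test the committee disagrees on most, run it, and feed the observation back) can be illustrated with a minimal sketch. This is not the authors' implementation: the paper infers models with a genetic programming engine, whereas the sketch below substitutes bootstrap-trained decision trees, and the numeric system_under_test function is a hypothetical stand-in for a library method.

# Minimal query-by-committee sketch (assumptions: decision trees replace the
# paper's genetic-programming models; the SUT is a toy numeric function).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def system_under_test(x):
    # Hypothetical numeric behaviour standing in for a real library method.
    return np.sin(x) / (1.0 + abs(x))

rng = np.random.default_rng(0)

# Seed observations from a few random tests.
X = rng.uniform(-10, 10, size=(5, 1))
y = np.array([system_under_test(v[0]) for v in X])

for _ in range(20):
    # Infer a committee of behavioural models from the observations so far.
    committee = []
    for i in range(10):
        idx = rng.integers(0, len(X), len(X))  # bootstrap resample
        committee.append(DecisionTreeRegressor(random_state=i).fit(X[idx], y[idx]))

    # Generate candidate test inputs and measure committee disagreement.
    candidates = rng.uniform(-10, 10, size=(200, 1))
    preds = np.stack([m.predict(candidates) for m in committee])
    disagreement = preds.var(axis=0)

    # Select the test the inferred models are least certain about,
    # execute it, and add the observation to the training data.
    best = candidates[np.argmax(disagreement)]
    X = np.vstack([X, [best]])
    y = np.append(y, system_under_test(best[0]))

print(f"Executed {len(X)} tests, concentrated where the committee disagreed most.")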
Year
2017
DOI
10.1109/ICST.2017.30
Venue
2017 IEEE International Conference on Software Testing, Verification and Validation (ICST)
Keywords
black-box testing, test generation, machine learning, uncertainty sampling, genetic programming
DocType
Conference
Volume
abs/1608.03181
ISSN
2381-2834
ISBN
978-1-5090-6032-0
Citations
3
PageRank
0.40
References
26
Authors
2
Name               Order   Citations   PageRank
Neil Walkinshaw    1       345         27.27
Gordon Fraser      2       2625        116.22