Abstract |
---|
Testing is a fundamental task for ensuring software quality. Regression testing aims to ensure that changes to software do not introduce new failures. As resources are often limited and test suites comprise a vast number of test cases, different regression testing strategies have been proposed to reduce testing effort by selecting or prioritizing important test cases, e.g., based on code coverage (to ensure sufficient testing depth). In system testing, however, source code is often not available, making the system under test a black box. In this paper, we introduce an automated, multi-objective test case selection technique for black-box systems using genetic algorithms. We define seven objectives based on meta-data, allowing flexible test case selection for a variety of systems. For evaluation, we apply our technique to two subject systems to assess the feasibility and suitability of our test case selection approach. Results indicate that our approach is applicable with different kinds of available data and outperforms both random test case selection and retest-all. |
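The selection technique the abstract describes — evolving subsets of a black-box test suite against multiple meta-data-based objectives with a genetic algorithm — can be sketched as follows. This is a minimal illustration, not the paper's method: it uses only two hypothetical objectives (minimize total execution time, maximize past-failure coverage) in place of the paper's seven, and a simple elitist Pareto-front selection in place of whatever algorithm the authors employed. All names and parameters are assumptions.

```python
# Minimal sketch: multi-objective black-box test case selection with a
# genetic algorithm. Each test case carries only meta-data (no source code):
# here, a hypothetical (execution_time, past_failure_count) pair.
import random

random.seed(42)

# Hypothetical meta-data for a suite of 30 test cases.
TESTS = [(random.randint(1, 10), random.randint(0, 5)) for _ in range(30)]

def evaluate(individual):
    """Score a bit-vector selection; both objectives are minimized,
    so failure coverage is negated."""
    time = sum(TESTS[i][0] for i, bit in enumerate(individual) if bit)
    fails = sum(TESTS[i][1] for i, bit in enumerate(individual) if bit)
    return (time, -fails)

def dominates(a, b):
    """Pareto dominance: a is no worse in every objective, better in one."""
    return all(x <= y for x, y in zip(a, b)) and \
           any(x < y for x, y in zip(a, b))

def mutate(ind, rate=0.05):
    # Flip each selection bit with a small probability.
    return [1 - bit if random.random() < rate else bit for bit in ind]

def crossover(p1, p2):
    # Single-point crossover of two parent selections.
    cut = random.randrange(1, len(p1))
    return p1[:cut] + p2[cut:]

def select_tests(pop_size=40, generations=50):
    pop = [[random.randint(0, 1) for _ in TESTS] for _ in range(pop_size)]
    for _ in range(generations):
        scored = [(evaluate(ind), ind) for ind in pop]
        # Elitist step: carry the non-dominated front into the next
        # generation, then refill with mutated offspring of front members.
        front = [ind for f, ind in scored
                 if not any(dominates(g, f) for g, _ in scored)]
        pop = front + [mutate(crossover(*random.choices(front, k=2)))
                       for _ in range(pop_size - len(front))]
    scored = [(evaluate(ind), ind) for ind in pop]
    return [ind for f, ind in scored
            if not any(dominates(g, f) for g, _ in scored)]

pareto_front = select_tests()
```

The result is a set of trade-off selections rather than a single suite: a tester can then pick, say, the cheapest selection that still covers historically failure-prone tests, which is the practical advantage over retest-all or random selection that the abstract claims.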
Year | DOI | Venue |
---|---|---|
2017 | 10.1145/3071178.3071189 | GECCO |
Keywords | Field | DocType
---|---|---|
Test Case Selection, Search-based Testing, Black-Box Testing, System Testing | Test Management Approach, Computer science, Manual testing, Test script, White-box testing, Non-regression testing, Regression testing, Artificial intelligence, Test strategy, Machine learning, Keyword-driven testing | Conference
Citations | PageRank | References
---|---|---|
2 | 0.36 | 27
Authors |
---|
6 |
Name | Order | Citations | PageRank |
---|---|---|---|
Remo Lachmann | 1 | 44 | 3.86 |
Michael Felderer | 2 | 538 | 78.87 |
Manuel Nieke | 3 | 8 | 1.08 |
Sandro Schulze | 4 | 259 | 23.43 |
Christoph Seidl | 5 | 207 | 20.15 |
Ina Schaefer | 6 | 1634 | 99.16 |