Title
Striving for failure: an industrial case study about test failure prediction
Abstract
Software regression testing is an important, yet very costly, part of most major software projects. When regression tests run, any failures they reveal help catch bugs early and smooth future development work. Executing large numbers of tests, however, consumes significant resources that could otherwise be applied elsewhere. If tests could be accurately classified as likely to pass or fail before a run, significant time could be saved while maintaining the benefits of early bug detection. In this paper, we present a case study in which we build a classifier for regression tests on industrial software, Microsoft Dynamics AX. We examine the effectiveness of this classification as well as which aspects of the software are most important in predicting regression test failures.
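As a rough illustration of the idea described in the abstract, the Python sketch below trains a binary classifier that flags tests as likely to fail before they are run and reports which features drive the prediction. It is only a minimal sketch on synthetic data: the feature names (code churn, historical failures, test age), the random-forest model, and all numbers are assumptions made for illustration, not the features or model the paper reports for Microsoft Dynamics AX.

# Minimal sketch of a test-failure classifier; all features and data are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(0)
n_tests = 1000

# Hypothetical per-test features: code churn touching the tested area,
# number of historical failures, and test age in days.
X = np.column_stack([
    rng.poisson(5, n_tests),        # churn
    rng.poisson(1, n_tests),        # past failures
    rng.integers(1, 365, n_tests),  # test age
])

# Synthetic labels: tests covering churned, historically flaky code fail more often.
p_fail = 1 / (1 + np.exp(-(0.2 * X[:, 0] + 0.8 * X[:, 1] - 3)))
y = rng.random(n_tests) < p_fail    # True = test fails

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

pred = clf.predict(X_test)
print("precision:", precision_score(y_test, pred))
print("recall:   ", recall_score(y_test, pred))
# Feature importances hint at which aspects matter most for failure prediction.
print("importances:", dict(zip(["churn", "past_failures", "test_age"],
                               clf.feature_importances_)))

In practice, such a classifier would be trained on features mined from the project's version-control and test-execution history rather than synthetic data, and skipping tests predicted to pass trades a small risk of missed failures for reduced execution cost.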
Year
2015
DOI
10.1109/ICSE.2015.134
Venue
ICSE
Keywords
Test failure prediction, data-mining software repositories, regression testing, case study
Field
Smoke testing (software), Risk-based testing, Computer science, Regression testing, Software reliability testing, Acceptance testing, Software verification and validation, Software construction, Reliability engineering, Software regression
DocType
Conference
Volume
2
ISSN
0270-5257
ISBN
978-1-4799-1934-5
Citations
4
PageRank
0.39
References
17
Authors
3
Name            Order    Citations    PageRank
Jeff Anderson   1        23           4.05
Saeed Salem     2        182          17.39
Hyunsook Do     3        1290         56.38