Abstract
---
Industrial practitioners now face a bewildering array of possible configurations for effort estimation. How can the best one be selected for a particular dataset? This paper introduces OIL (short for "optimized learning"), a novel configuration tool for effort estimation based on differential evolution. When tested on 945 software projects, OIL significantly improved effort estimates after exploring only a few dozen configurations. Further, OIL's results are far better than two methods in widespread use: estimation-via-analogy and a recent state-of-the-art baseline published at TOSEM'15 by Whigham et al. Given that the computational cost of this approach is so low, and the observed improvements are so large, we conclude that SBSE should be a standard component of software effort estimation.
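The abstract names differential evolution (DE) as OIL's search strategy for exploring configurations. The following is a minimal, self-contained sketch of the classic DE loop (mutate, crossover, greedy replacement); the function names, parameters, and toy objective are illustrative assumptions, not OIL's actual implementation.

```python
# Illustrative sketch of differential evolution (DE), the generic search
# strategy the abstract says OIL builds on. Names and defaults are
# hypothetical; OIL's real objective is effort-estimation error.
import random

def differential_evolution(fitness, bounds, pop_size=10, generations=10,
                           f=0.75, cr=0.3):
    """Minimize `fitness` over a box given by `bounds` = [(lo, hi), ...]."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    scores = [fitness(ind) for ind in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct peers of individual i.
            a, b, c = random.sample(
                [p for j, p in enumerate(pop) if j != i], 3)
            # Mutation + crossover: each gene is either kept, or replaced
            # by a + f * (b - c) with probability cr.
            trial = [x + f * (y - z) if random.random() < cr else cur
                     for cur, x, y, z in zip(pop[i], a, b, c)]
            # Clamp the trial vector back into the legal bounds.
            trial = [min(max(v, lo), hi)
                     for v, (lo, hi) in zip(trial, bounds)]
            s = fitness(trial)
            if s < scores[i]:  # greedy replacement keeps the better one
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=lambda i: scores[i])
    return pop[best], scores[best]

# Toy objective standing in for "estimation error of a configuration":
# the sphere function, minimized at the origin.
cfg, err = differential_evolution(lambda x: sum(v * v for v in x),
                                  bounds=[(-5, 5)] * 3)
```

Note the budget: `pop_size=10` and `generations=10` amount to roughly a hundred fitness evaluations, consistent with the abstract's claim that only a few dozen configurations need exploring.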
Year | Venue | Field
---|---|---
2018 | arXiv: Software Engineering | Industrial engineering, Systems engineering, Computer science, Differential evolution, Software

DocType | Volume | Citations
---|---|---
Journal | abs/1804.00626 | 2

PageRank | References | Authors
---|---|---
0.36 | 35 | 5
Name | Order | Citations | PageRank |
---|---|---|---|
Tianpei Xia | 1 | 4 | 1.39 |
Jianfeng Chen | 2 | 14 | 3.79 |
George Mathew | 3 | 4 | 2.47 |
Xipeng Shen | 4 | 7 | 2.82 |
Tim Menzies | 5 | 2886 | 151.44 |