Abstract |
---|
Background: Continuously calibrated and validated parametric models are necessary for realistic software estimates. In practice, however, variations in model adoption and usage patterns introduce a great deal of local bias into the resulting historical data. Such local bias should be carefully examined and addressed before the historical data can be used to calibrate new versions of parametric models. Aims: In this study, we aim to investigate the degree of such local bias in a cross-company historical dataset and to assess its impact on the performance of parametric estimation models. Method: Our study consists of three parts: 1) defining a method for measuring and analyzing the local bias associated with each individual organization's data subset within the overall dataset; 2) assessing the impact of local bias on the estimation performance of the COCOMO II 2000 model; 3) performing a correlation analysis to verify that local bias can be harmful to the performance of a parametric estimation model. Results: Our results show that local bias negatively impacts the performance of the parametric model; our measure of local bias has a statistically significant positive correlation with estimation performance. Conclusion: Local calibration using the whole multi-company dataset yields worse performance. The influence of multi-company data can be characterized as local bias and measured with our method. |
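The third step in the Method part above (a correlation analysis between local bias and estimation performance) can be sketched as follows. This is an illustrative sketch only, not the paper's actual procedure: the local-bias scores, error values, and helper functions (`mmre`, `spearman`) are hypothetical assumptions, and MMRE stands in for whatever performance measure the paper uses.

```python
# Illustrative sketch: correlate a hypothetical per-organization local-bias
# score with that organization's estimation error (MMRE).
import statistics

def mmre(actuals, estimates):
    """Mean Magnitude of Relative Error: mean(|actual - estimate| / actual)."""
    return statistics.mean(abs(a - e) / a for a, e in zip(actuals, estimates))

def spearman(xs, ys):
    """Spearman rank correlation: Pearson correlation of the rank vectors
    (assumes no ties, which holds for the toy data below)."""
    def ranks(vs):
        order = sorted(range(len(vs)), key=lambda i: vs[i])
        r = [0.0] * len(vs)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    rx, ry = ranks(xs), ranks(ys)
    mx, my = statistics.mean(rx), statistics.mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: one local-bias score and one MMRE per organization.
local_bias = [0.10, 0.25, 0.40, 0.55]
errors = [0.20, 0.30, 0.45, 0.60]
print(spearman(local_bias, errors))  # perfectly monotone toy data -> 1.0
```

A significant positive coefficient here would mirror the paper's finding that organizations contributing more locally biased data see worse estimation performance from the globally calibrated model.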
Year | DOI | Venue |
---|---|---|
2011 | 10.1145/2020390.2020404 | Promise |
Keywords | Field | DocType |
---|---|---
model adoption,parametric estimation model,multi-company data,historical data,estimation performance,local bias,parametric model,local calibration,individual organization data subset,worse performance | Econometrics,Data mining,Parametric model,Computer science,Software,Positive correlation,COCOMO,Parametric estimation,Calibration,Correlation analysis | Conference |
Citations | PageRank | References |
---|---|---
6 | 0.48 | 13 |
Authors |
---|
7 |
Name | Order | Citations | PageRank |
---|---|---|---|
Ye Yang | 1 | 6 | 0.48 |
Lang Xie | 2 | 81 | 5.82 |
Zhimin He | 3 | 536 | 35.90 |
Qi Li | 4 | 87 | 7.18 |
Vu Nguyen | 5 | 134 | 18.35 |
Barry W. Boehm | 6 | 6849 | 1171.18 |
Ricardo Valerdi | 7 | 233 | 29.76 |