Title: Bootstrap analysis of multiple repetitions of experiments using an interval-valued multiple comparison procedure
Abstract: A new bootstrap test is introduced for assessing the significance of the differences between stochastic algorithms in a cross-validation setup with repeated folds. Intervals are used to model the variability of the data that can be attributed to repeating the learning and testing stages over the same folds of the cross-validation. Numerical experiments are provided that support the following three claims: (1) bootstrap tests can be more powerful than ANOVA or the Friedman test for comparing multiple classifiers; (2) in the presence of outliers, interval-valued bootstrap tests discriminate between stochastic algorithms better than nonparametric tests; (3) choosing ANOVA, the Friedman test, or the bootstrap test can lead to different conclusions in experiments involving actual data from machine learning tasks.
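To make the idea in the abstract concrete, the following is a minimal sketch of a paired bootstrap test on interval-valued cross-validation scores. It is an illustrative assumption, not the authors' actual procedure from the paper: the [min, max] summary over repetitions, the midpoint test statistic, and the function name bootstrap_interval_test are all hypothetical choices; only NumPy is used.

```python
# Hypothetical sketch (not the paper's exact method): paired bootstrap test on
# interval-valued cross-validation results. Each fold's score for an algorithm
# is summarized as an interval [min, max] over the repeated runs on that fold.
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_interval_test(scores_a, scores_b, n_boot=10_000):
    """scores_a, scores_b: arrays of shape (n_folds, n_repetitions).

    Returns a bootstrap p-value for the null hypothesis that the two
    algorithms perform equally well on the repeated folds.
    """
    # Fold-wise interval bounds of the difference A - B.
    lo_diff = scores_a.min(axis=1) - scores_b.max(axis=1)   # most pessimistic difference
    hi_diff = scores_a.max(axis=1) - scores_b.min(axis=1)   # most optimistic difference
    mid_diff = 0.5 * (lo_diff + hi_diff)                    # midpoints of the difference intervals

    observed = mid_diff.mean()
    n_folds = mid_diff.size

    # Resample fold-wise differences, centered at zero to mimic the null hypothesis.
    centered = mid_diff - observed
    count = 0
    for _ in range(n_boot):
        sample = centered[rng.integers(0, n_folds, n_folds)]
        if abs(sample.mean()) >= abs(observed):
            count += 1
    return (count + 1) / (n_boot + 1)

# Usage with synthetic 10-fold, 5-repetition accuracies.
a = rng.normal(0.85, 0.02, size=(10, 5))
b = rng.normal(0.82, 0.02, size=(10, 5))
print("bootstrap p-value:", bootstrap_interval_test(a, b))
```

The sketch only illustrates the two ingredients named in the abstract (a bootstrap resampling test and an interval summary of repeated runs per fold); the paper's own interval-valued multiple comparison procedure differs in its details.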
Year: 2014
DOI: 10.1016/j.jcss.2013.03.009
Venue: J. Comput. Syst. Sci.
Keywords: actual data, interval-valued multiple comparison procedure, better discrimination, choosing ANOVA, stochastic algorithm, cross validation, interval-valued bootstrap test, multiple repetition, bootstrap analysis, new bootstrap test, bootstrap test, Friedman test, nonparametric test
Field: Friedman test, Outlier, Multiple comparisons problem, Nonparametric statistics, Bootstrap aggregating, Statistics, Cross-validation, Mathematics, Bootstrapping (electronics), Analysis of variance
DocType: Journal
Volume: 80
Issue: 1
ISSN: 0022-0000
Citations: 5
PageRank: 0.64
References: 11
Authors: 4

Name              Order  Citations  PageRank
José Otero        1      552        24.66
Luciano Sánchez   2      377        26.34
Inés Couso        3      850        69.91
Ana Palacios      4      6          1.04