Title
Novel tight classification error bounds under mismatch conditions based on f-Divergence
Abstract
Statistical classification and multiple hypothesis testing inevitably face model mismatch when the true distributions in Bayes' decision rule are replaced by model distributions estimated from training samples. Although many statistical measures of this mismatch exist, they are rarely related to the mismatch in accuracy, i.e. the difference between the model error and the Bayes error. In this work, the accuracy mismatch between the ideal Bayes decision rule/Bayes test and a mismatched decision rule in statistical classification/multiple hypothesis testing is investigated explicitly. A proof of a novel, generalized, tight statistical bound on the accuracy mismatch is presented. This result is compared to existing statistical bounds related to the total variation distance, which can be extended to bounds on the accuracy mismatch. The analytic results are supported by distribution simulations.
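The accuracy mismatch the abstract refers to can be reproduced numerically. Below is a minimal sketch, assuming a discrete two-class example with equal priors; the alphabet size K, the 70/30 mixing used to construct the mismatched models, and the helper error() are illustrative assumptions, not taken from the paper. It compares the Bayes error with the error of a rule that plugs in mismatched model distributions, alongside the classical total-variation comparison (gap <= 2*TV of the joints), i.e. one of the existing bounds the abstract mentions, not the paper's new f-divergence bound.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discrete two-class setup with equal priors; alphabet
# size K and the 70/30 mixing below are illustrative choices only.
K = 10
p = rng.dirichlet(np.ones(K), size=2)                  # true p(x|c), shape (2, K)
q = 0.7 * p + 0.3 * rng.dirichlet(np.ones(K), size=2)  # mismatched models q(x|c)
prior = np.array([0.5, 0.5])

joint_p = prior[:, None] * p  # true joint p(c, x)
joint_q = prior[:, None] * q  # model joint q(c, x)

# Bayes rule decides argmax_c p(c, x); the mismatched rule plugs in q.
bayes_rule = joint_p.argmax(axis=0)
model_rule = joint_q.argmax(axis=0)

def error(rule, joint):
    """Error probability of a deterministic rule under the given joint."""
    return 1.0 - joint[rule, np.arange(K)].sum()

bayes_error = error(bayes_rule, joint_p)  # Bayes error (optimal)
model_error = error(model_rule, joint_p)  # mismatched rule, true joint
gap = model_error - bayes_error           # accuracy mismatch, >= 0

# Classical comparison: gap <= 2 * TV(joint_p, joint_q).
tv = 0.5 * np.abs(joint_p - joint_q).sum()

print(f"Bayes error: {bayes_error:.4f}  model error: {model_error:.4f}")
print(f"accuracy mismatch: {gap:.4f}  <=  2*TV: {2 * tv:.4f}")
```

Running this shows the accuracy mismatch dominated by the total-variation term; tightening that kind of comparison is precisely what the paper's f-divergence-based bound addresses.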
Year
2013
DOI
10.1109/ITW.2013.6691302
Venue
Information Theory Workshop
Keywords
Bayes methods, error statistics, pattern classification, statistical analysis, Bayes decision rule, Bayes error, Bayes test, accuracy mismatch, f-divergence, mismatched decision, model distributions, model error, model mismatch, multiple hypothesis testing, novel generalized tight statistical bound, statistical classification, statistical measures, total variation distance, training samples
Field
Decision rule, Discrete mathematics, Bayes factor, Multiple comparisons problem, Algorithm, Statistical classification, Statistics, f-divergence, Bayes error rate, Bayes classifier, Mathematics, Bayes' theorem
DocType
Conference
ISBN
978-1-4799-1321-3
Citations
0
PageRank
0.34
References
0
Authors
6
Name                 Order  Citations  PageRank
Ralf Schlüter        1      1337       136.18
Markus Nußbaum-Thom  2      73         7.00
Eugen Beck           3      18         4.77
Tamer Alkhouli       4      75         7.56
Hermann Ney          5      14178      1506.93
Nussbaum-Thom, M.    6      0          0.34