Title
Selective Probabilistic Classifier Based on Hypothesis Testing
Abstract
In this paper, we propose a simple yet effective method to deal with the violation of the Closed-World Assumption for a classifier. Previous works tend to apply a threshold either on the classification scores or on the loss function to reject inputs that violate the assumption. However, these methods cannot achieve the low False Positive Ratio (FPR) required in safety applications. The proposed method is a rejection option based on hypothesis testing with probabilistic networks. With probabilistic networks, it is possible to estimate the distribution of outcomes instead of a single output. By applying a Z-test to the mean and standard deviation of each class, the proposed method estimates the statistical significance of the network's certainty and rejects uncertain outputs. The proposed method was evaluated with different configurations of the COCO and CIFAR datasets. Its performance is compared with Softmax Response, a known top-performing method. It is shown that the proposed method achieves a broader range of operation and covers a lower FPR than the alternative.
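To illustrate the idea described in the abstract, the following is a minimal sketch in Python, not the authors' exact procedure: it assumes the probabilistic network is approximated by T stochastic forward passes (e.g. Monte Carlo dropout) and that significance is assessed with a one-sided Z-test between the top two class means. The function name, the epsilon guard, and the choice of comparing only the top two classes are illustrative assumptions.

import numpy as np
from scipy import stats

def z_test_reject(mc_scores, alpha=0.05):
    # mc_scores: (T, C) array of class scores from T stochastic forward
    # passes (e.g. Monte Carlo dropout) for a single input.
    mean = mc_scores.mean(axis=0)          # per-class mean score
    std = mc_scores.std(axis=0, ddof=1)    # per-class standard deviation
    T = mc_scores.shape[0]

    # Rank classes by mean score; compare the best against the runner-up.
    order = np.argsort(mean)[::-1]
    c1, c2 = order[0], order[1]

    # One-sided Z statistic for the difference of the two class means.
    se = np.sqrt(std[c1] ** 2 / T + std[c2] ** 2 / T) + 1e-12
    z = (mean[c1] - mean[c2]) / se
    p_value = 1.0 - stats.norm.cdf(z)

    # Accept the prediction only if the difference is significant.
    return c1, bool(p_value < alpha)

Under these assumptions, a selective classifier would output class c1 when the test accepts and abstain otherwise; sweeping alpha trades coverage against the FPR of the rejection option.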
Year
2021
DOI
10.1109/EUVIP50544.2021.9483967
Venue
2021 9th European Workshop on Visual Information Processing (EUVIP)
Keywords
Selective Classifier, Probabilistic Neural Network, Statistical Analysis, Uncertainty Estimation
DocType
Conference
ISSN
2164-974X
ISBN
978-1-6654-3231-3
Citations
0
PageRank
0.34
References
0
Authors
3
Name | Order | Citations | PageRank
Saeed Bakhshi Germi | 1 | 0 | 0.34
Esa Rahtu | 2 | 832 | 52.76
Heikki Huttunen | 3 | 244 | 28.20