Title
Learning Confidence for Out-of-Distribution Detection in Neural Networks
Abstract
Modern neural networks are very powerful predictive models, but they are often incapable of recognizing when their predictions may be wrong. Closely related to this is the task of out-of-distribution detection, where a network must determine whether or not an input is outside of the set on which it is expected to safely perform. To jointly address these issues, we propose a method of learning confidence estimates for neural networks that is simple to implement and produces intuitively interpretable outputs. We demonstrate that on the task of out-of-distribution detection, our technique surpasses recently proposed techniques which construct confidence based on the network's output distribution, without requiring any additional labels or access to out-of-distribution examples. Additionally, we address the problem of calibrating out-of-distribution detectors, where we demonstrate that misclassified in-distribution examples can be used as a proxy for out-of-distribution examples.
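The abstract describes the method only at a high level. As a rough illustration of how a confidence estimate of this kind can be learned jointly with a classifier, the PyTorch sketch below adds a scalar confidence branch, interpolates the predicted distribution toward the one-hot label in proportion to (1 - confidence), and penalizes low confidence so that "asking for hints" is not free. The architecture, the interpolation scheme, and the penalty weight lmbda are illustrative assumptions, not details taken from the abstract.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ConfidenceNet(nn.Module):
    """Classifier with an extra scalar confidence output (hypothetical architecture)."""
    def __init__(self, in_features=784, num_classes=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Flatten(), nn.Linear(in_features, 256), nn.ReLU())
        self.class_head = nn.Linear(256, num_classes)  # class logits
        self.conf_head = nn.Linear(256, 1)             # confidence logit

    def forward(self, x):
        h = self.encoder(x)
        return self.class_head(h), torch.sigmoid(self.conf_head(h))  # logits, c in (0, 1)

def confidence_loss(logits, conf, target, lmbda=0.1):
    # Interpolate the softmax prediction toward the one-hot label in proportion
    # to (1 - c), then charge -log(c) so that lowering confidence has a cost.
    probs = F.softmax(logits, dim=1)
    onehot = F.one_hot(target, probs.size(1)).float()
    c = conf.clamp(1e-6, 1.0)
    adjusted = c * probs + (1.0 - c) * onehot
    task_loss = F.nll_loss(torch.log(adjusted), target)
    conf_penalty = -torch.log(c).mean()
    return task_loss + lmbda * conf_penalty

# At test time the confidence output serves as the out-of-distribution score:
# inputs whose c falls below a chosen threshold are flagged as out-of-distribution.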
Year
2018
Venue
arXiv: Machine Learning
Field
Artificial intelligence, Artificial neural network, Detector, Mathematics, Machine learning
DocType
Journal
Volume
abs/1802.04865
Citations
6
PageRank
0.42
References
10
Authors
2
Name                Order    Citations    PageRank
Terrance DeVries    1        133          6.04
Graham W. Taylor    2        1523         127.22