Abstract |
---|
Deep neural networks tend to underestimate uncertainty and produce overly confident predictions. Recently proposed solutions, such as MC-Dropout and SDE-Net, require complex training and/or auxiliary out-of-distribution data. We propose a simple solution by extending the time-tested iteratively reweighted least squares (IRLS) method from generalised linear regression. We use two sub-networks to parametrise the prediction and the uncertainty estimate, enabling easy handling of complex inputs and nonlinear responses. The two sub-networks share representations and are trained via two complementary loss functions, one for the prediction and one for the uncertainty estimate, with interleaving steps as in a cooperative game. Compared with more complex models such as MC-Dropout and SDE-Net, our proposed network is simpler to implement and more robust (insensitive to varying aleatoric and epistemic uncertainty). |
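The abstract extends classical IRLS, in which a mean model and a variance model are refitted in alternating steps. The sketch below illustrates only that classical idea on a linear model; it is not the paper's method — the paper replaces both models with neural sub-networks sharing representations, and the synthetic data, the linear mean model, and the log-linear variance model here are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic heteroscedastic data: y = 2x + 1, noise std grows with |x|.
n = 200
x = np.linspace(-2.0, 2.0, n)
sigma_true = 0.1 + 0.3 * np.abs(x)
y = 2.0 * x + 1.0 + rng.normal(0.0, sigma_true)

X = np.column_stack([x, np.ones(n)])          # design matrix for the mean model
V = np.column_stack([np.abs(x), np.ones(n)])  # features for the (assumed) variance model

beta = np.zeros(2)   # mean-model parameters
gamma = np.zeros(2)  # log-variance parameters

for _ in range(20):
    # Step 1: update the variance model by regressing log squared
    # residuals of the current mean fit on the variance features.
    resid2 = (y - X @ beta) ** 2 + 1e-8
    gamma, *_ = np.linalg.lstsq(V, np.log(resid2), rcond=None)
    w = np.exp(-V @ gamma)  # observation weights = 1 / estimated variance

    # Step 2: weighted least squares for the mean parameters,
    # down-weighting observations the variance model deems noisy.
    Xw = X * w[:, None]
    beta = np.linalg.solve(X.T @ Xw, Xw.T @ y)

print(beta)  # close to the true parameters [2.0, 1.0]
```

Alternating the two fits mirrors the interleaved, cooperative training described in the abstract: the uncertainty model reweights the observations for the prediction fit, and the prediction fit supplies residuals for the uncertainty update.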

Year | DOI | Venue
---|---|---
2021 | 10.1109/IJCNN52387.2021.9533617 | 2021 International Joint Conference on Neural Networks (IJCNN)

DocType | ISSN | Citations
---|---|---
Conference | 2161-4393 | 0

PageRank | References | Authors
---|---|---
0.34 | 0 | 4

Name | Order | Citations | PageRank
---|---|---|---
Akib Mashrur | 1 | 0 | 0.34 |
Wei Luo | 2 | 109 | 14.13 |
Nayyar Abbas Zaidi | 3 | 91 | 9.88 |
Antonio Robles-Kelly | 4 | 991 | 79.88 |