| Abstract |
|---|
| Learning with noisy labels is an important and challenging task for training accurate deep neural networks. Some commonly used loss functions, such as Cross Entropy (CE), suffer from severe overfitting to noisy labels. Robust loss functions that satisfy the symmetric condition were tailored to remedy this problem; however, they encounter an underfitting effect. In this paper, we theoretically pro... |
| Year | DOI | Venue |
|---|---|---|
| 2021 | 10.1109/ICCV48922.2021.00014 | 2021 IEEE/CVF International Conference on Computer Vision (ICCV) |
| Keywords | DocType | ISBN |
|---|---|---|
| Training, Deep learning, Computer vision, Neural networks, Fitting, Robustness, Entropy | Conference | 978-1-6654-2812-5 |
| Citations | PageRank | References |
|---|---|---|
| 1 | 0.38 | 0 |
| Authors |
|---|
| 6 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Xiong Zhou | 1 | 1 | 1.39 |
| Xianming Liu | 2 | 461 | 47.55 |
| Chenyang Wang | 3 | 1 | 0.38 |
| Deming Zhai | 4 | 34 | 4.13 |
| Junjun Jiang | 5 | 1138 | 74.49 |
| Xiangyang Ji | 6 | 533 | 73.14 |