Title |
---|
Hot-Refresh Model Upgrades with Regression-Free Compatible Training in Image Retrieval |
Abstract |
---|
The task of hot-refresh model upgrades of image retrieval systems plays an essential role in industry but has never before been investigated in academia. Conventional cold-refresh model upgrades can deploy new models only after the entire gallery is backfilled, which takes weeks or even months for massive data. In contrast, hot-refresh model upgrades deploy the new model immediately and then gradually improve retrieval accuracy by backfilling the gallery on-the-fly. Compatible training has made this possible; however, the problem of model regression with negative flips poses a great challenge to the stable improvement of user experience. We argue that this is mainly because new-to-old positive query-gallery pairs may show less similarity than new-to-new negative pairs. To solve the problem, we introduce a Regression-Alleviating Compatible Training (RACT) method that properly constrains feature compatibility while reducing negative flips. The core idea is to encourage the new-to-old positive pairs to be more similar than both the new-to-old negative pairs and the new-to-new negative pairs. An efficient uncertainty-based backfilling strategy is further introduced to accelerate accuracy improvements. Extensive experiments on large-scale retrieval benchmarks (e.g., Google Landmark) demonstrate that our RACT effectively alleviates model regression, taking one more step towards seamless model upgrades. |
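The core constraint described above — that a new-model query should score higher against its old-model positive gallery feature than against both new-to-old and new-to-new negatives — can be illustrated with a minimal hinge-style sketch. This is not the authors' actual RACT loss (the paper's exact formulation is not given here); the function name `ract_style_loss`, the cosine-similarity choice, and the `margin` parameter are all illustrative assumptions.

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors (plain lists)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def ract_style_loss(q_new, g_pos_old, g_negs_old, g_negs_new, margin=0.2):
    """Hypothetical hinge loss sketching the RACT-style constraint.

    Encourages the new-to-old positive similarity to exceed, by `margin`,
    the similarity to BOTH new-to-old negatives and new-to-new negatives,
    so that backfilled (old) and refreshed (new) gallery features can be
    ranked together without negative flips.
    """
    pos = cosine(q_new, g_pos_old)
    loss = 0.0
    for neg in g_negs_old + g_negs_new:  # both negative sources constrained
        loss += max(0.0, margin - (pos - cosine(q_new, neg)))
    return loss
```

When the positive pair already out-scores every negative by at least the margin, the loss is zero; otherwise the violating pairs contribute linearly, pushing the new model's embedding space toward compatibility with the old gallery features.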
Year | Venue | DocType |
---|---|---|
2022 | International Conference on Learning Representations (ICLR) | Conference |
Citations | PageRank | References
---|---|---|
0 | 0.34 | 0 |
Authors |
---|
8 |
Name | Order | Citations | PageRank |
---|---|---|---|
Binjie Zhang | 1 | 0 | 0.34 |
Yixiao Ge | 2 | 0 | 4.73 |
Yantao Shen | 3 | 0 | 0.34 |
Yu Li | 4 | 483 | 24.38 |
Chun Yuan | 5 | 0 | 0.34 |
Xuyuan Xu | 6 | 0 | 0.68 |
Yexin Wang | 7 | 0 | 1.35 |
Ying Shan | 8 | 0 | 0.34 |