
Darwinian Model Upgrades: Model Evolving with Selective Compatibility (2210.06954v1)

Published 13 Oct 2022 in cs.CV

Abstract: The traditional model-upgrading paradigm for retrieval requires recomputing all gallery embeddings before deploying the new model (dubbed "backfilling"), which is expensive and time-consuming given the billions of instances in industrial applications. BCT took the first step towards backward-compatible model upgrades that dispense with backfilling. It is workable but, because of its undifferentiated compatibility constraints, leaves the new model in a dilemma between new-feature discriminativeness and new-to-old compatibility. In this work, we propose Darwinian Model Upgrades (DMU), which disentangles inheritance and variation in model evolution via selective backward compatibility and forward adaptation, respectively. The old-to-new heritable knowledge is measured by old-feature discriminativeness, and the gallery features, especially those of poor quality, are evolved in a lightweight manner to become more adaptive in the new latent space. We demonstrate the superiority of DMU through comprehensive experiments on large-scale landmark retrieval and face recognition benchmarks. DMU effectively alleviates new-to-new degradation and improves new-to-old compatibility, rendering a more proper model-upgrading paradigm for large-scale retrieval systems.
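The core idea (comparing new queries against an old gallery without recomputing every gallery embedding, and instead evolving the gallery features with a lightweight map) can be illustrated with a toy sketch. This is not the paper's method: the data, dimensions, the rotation standing in for the old-to-new model change, and the least-squares linear map used as the "lightweight evolution" module are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: old and new embedding spaces related by an unknown
# rotation plus noise (stand-ins for two versions of the retrieval model).
d = 32
n_gallery, n_anchor = 200, 50
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))  # unknown old->new transform

old_gallery = rng.normal(size=(n_gallery, d))          # features from old model

# "Lightweight feature evolution": fit a linear map old->new on a small
# anchor set of paired features (least squares), then evolve the whole
# gallery without re-extracting it through the new model (no backfilling).
old_anchor = rng.normal(size=(n_anchor, d))
new_anchor = old_anchor @ Q + 0.05 * rng.normal(size=(n_anchor, d))
W, *_ = np.linalg.lstsq(old_anchor, new_anchor, rcond=None)
evolved_gallery = old_gallery @ W

def top1_accuracy(queries, gallery):
    # Cosine-similarity retrieval; query i should match gallery item i.
    qn = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    gn = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    return float(np.mean((qn @ gn.T).argmax(axis=1) == np.arange(len(queries))))

# Queries embedded by the new model, searched against the old gallery.
new_queries = old_gallery @ Q + 0.05 * rng.normal(size=(n_gallery, d))
acc_raw = top1_accuracy(new_queries, old_gallery)       # new-to-old, unevolved
acc_evolved = top1_accuracy(new_queries, evolved_gallery)

print(f"new-to-old, raw gallery:     {acc_raw:.2f}")
print(f"new-to-old, evolved gallery: {acc_evolved:.2f}")
```

In this synthetic setting, raw new-to-old retrieval collapses (the spaces are incompatible), while the cheaply evolved gallery restores near-perfect matching, mirroring the motivation for adapting gallery features rather than backfilling them.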

Authors (8)
  1. Binjie Zhang (7 papers)
  2. Shupeng Su (4 papers)
  3. Yixiao Ge (99 papers)
  4. Xuyuan Xu (10 papers)
  5. Yexin Wang (16 papers)
  6. Chun Yuan (127 papers)
  7. Mike Zheng Shou (165 papers)
  8. Ying Shan (252 papers)
Citations (3)
