

Poster in Workshop: Localized Learning: Decentralized Model Updates via Non-Global Objectives

Metric Compatible Training for Online Backfilling in Large-Scale Retrieval

Seonguk Seo · Mustafa Gokhan Uzunbas · Bohyung Han · Xuefei Cao · Joena Zhang · Taipeng Tian · Ser Nam Lim

Keywords: [ large-scale image retrieval ] [ compatibility ] [ model upgrade ] [ backward compatible training ]


Abstract:

In large-scale retrieval systems, model upgrades require backfilling, the process of re-extracting all gallery embeddings with the upgraded model. This process incurs a prohibitively large computational cost and may even entail service downtime. To alleviate this bottleneck, backward-compatible learning has been proposed, which learns the feature space of the new model so that it remains compatible with that of the old model. Although this sidesteps the challenge by tackling only query-side representations, it leads to suboptimal solutions in principle because the gallery embeddings cannot benefit from the model upgrade. We address this dilemma by introducing an online backfilling algorithm, which achieves progressive performance improvement during the backfilling process without sacrificing the full performance of the new model once backfilling is complete. To this end, we first show that a simple distance rank merge is a reasonable option for online backfilling. We then incorporate a reverse transformation module and metric-compatible contrastive learning, yielding desirable merge results during backfilling with no extra overhead. Extensive experiments demonstrate the benefits of our framework on four standard benchmarks in various settings.
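The abstract names the distance rank merge only at a high level. As a rough illustration, the sketch below shows one plausible form of the idea: during backfilling, the not-yet-backfilled portion of the gallery is searched with the old query embedding, the already-backfilled portion with the new one, and the two ranked lists are merged by distance. The function name rank_merge, the cosine-distance choice, and the L2-normalization assumption are ours for illustration, not taken from the paper.

```python
import numpy as np

def rank_merge(query_old, query_new, gallery_old, gallery_new, top_k=10):
    """Merge retrieval results during online backfilling (illustrative sketch).

    query_old / query_new: (d,) embeddings of the query from the old and
        new models. gallery_old: (n_old, d) embeddings of gallery items not
        yet backfilled; gallery_new: (n_new, d) embeddings already
        re-extracted by the new model. Assumes all embeddings are
        L2-normalized so that 1 - inner product is a cosine distance that
        is comparable across the two spaces -- the calibration that
        metric-compatible training is meant to provide.
    """
    # Distances in the old feature space (items not yet backfilled).
    d_old = 1.0 - gallery_old @ query_old            # shape: (n_old,)
    # Distances in the new feature space (items already backfilled).
    d_new = 1.0 - gallery_new @ query_new            # shape: (n_new,)

    # Concatenate candidates from both spaces and sort by merged distance.
    dists = np.concatenate([d_old, d_new])
    ids = np.concatenate([
        np.arange(len(d_old)),                        # ids in the old gallery
        len(d_old) + np.arange(len(d_new)),           # ids offset into the new gallery
    ])
    order = np.argsort(dists)[:top_k]
    return ids[order], dists[order]
```

Merging by raw distance is only meaningful if distances produced by the two models live on a common scale; enforcing that comparability is precisely the role the paper assigns to the reverse transformation module and metric-compatible contrastive learning.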
