Data-Efficient Ranking Distillation for Image Retrieval

Zakaria Laskar*, Juho Kannala

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Abstract

Recent advances in deep learning have led to rapid developments in the field of image retrieval. However, the best-performing architectures incur significant computational cost. This paper addresses the issue using knowledge distillation for metric learning problems. Unlike previous approaches, our proposed method jointly addresses the following constraints: i) a limited number of queries to the teacher model, ii) a black-box teacher model with access only to the final output representation, and iii) a small fraction of the original training data without any ground-truth labels. In addition, the distillation method does not require the student and teacher to have the same dimensionality. The key idea is to augment the original training set with additional samples by performing linear interpolation in the final output representation space. In low-training-sample settings, our approach outperforms the fully supervised baseline approach on ROxford5k and RParis6k with the least possible teacher supervision.
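The key idea described in the abstract, augmenting the training set by linear interpolation in the teacher's output representation space, can be sketched as follows. This is a minimal illustration, not the authors' exact method: the function name, the Beta-distributed mixing coefficient, and the re-normalization step are assumptions borrowed from common mixup-style practice for L2-normalized retrieval embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)

def interpolate_embeddings(t_i, t_j, alpha=1.0):
    """Create a synthetic target embedding by linearly interpolating
    between two teacher embeddings (mixup-style augmentation).

    A student can then be trained to match such synthetic targets,
    stretching a small set of teacher queries into more training pairs.
    """
    lam = rng.beta(alpha, alpha)          # mixing coefficient in [0, 1]
    t_mix = lam * t_i + (1.0 - lam) * t_j
    # Re-normalize, since retrieval embeddings are typically L2-normalized
    return t_mix / np.linalg.norm(t_mix)

# Toy teacher embeddings for two images (dimension 4 for illustration)
t1 = np.array([1.0, 0.0, 0.0, 0.0])
t2 = np.array([0.0, 1.0, 0.0, 0.0])
t_new = interpolate_embeddings(t1, t2)
print(t_new.shape)   # (4,)
```

Because the interpolation happens in the teacher's output space, each synthetic target costs no additional teacher queries, which is what makes the approach compatible with the limited-query, black-box constraints listed above.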

Original language: English
Title of host publication: Computer Vision – ACCV 2020 - 15th Asian Conference on Computer Vision, 2020, Revised Selected Papers
Editors: Hiroshi Ishikawa, Cheng-Lin Liu, Tomas Pajdla, Jianbo Shi
Pages: 469-484
Number of pages: 16
DOIs
Publication status: Published - 2021
MoE publication type: A4 Article in a conference publication
Event: Asian Conference on Computer Vision - Virtual, Online
Duration: 30 Nov 2020 - 4 Dec 2020
Conference number: 15

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12622 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: Asian Conference on Computer Vision
Abbreviated title: ACCV
City: Virtual, Online
Period: 30/11/2020 - 04/12/2020
