
Efficient Person Search via Expert-Guided Knowledge Distillation

Research output: Contribution to journal › Article › peer-review

19 Scopus citations

Abstract

The person search problem aims to find a target person in scene images, which places high demands on both effectiveness and efficiency. In this paper, we present a unified person search framework that jointly addresses both demands for real-world applications. We explore knowledge distillation (KD), which allows a student network to approach the capabilities of deep expert networks with far fewer parameters and less computing time. To achieve this, we describe an efficient person search network and a set of deep, well-engineered expert networks, and build a tiny, compact model that approximates the representations of the expert networks in a multitask learning manner. We present extensive experiments on three customized student networks at different scales and show strong performance compared with state-of-the-art methods on both mean average precision and top-1 accuracy. We further demonstrate the efficiency of the proposed network, which runs at 120 frames/s in the feedforward pass with only a small sacrifice in accuracy.
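The abstract describes a student network trained to approximate the feature representations of several expert networks in a multitask manner. The sketch below is a minimal illustration of such a representation-distillation objective, assuming per-task feature vectors and a weighted sum of per-expert MSE terms; all names and weights are hypothetical, not the paper's actual implementation.

```python
import numpy as np

def distillation_loss(student_feats, expert_feats, weights=None):
    """Multitask representation distillation: the student's feature
    vector for each task is regressed onto the corresponding expert's
    representation (MSE per task), then combined as a weighted sum.
    Task names and weights here are illustrative assumptions."""
    tasks = list(expert_feats)
    if weights is None:
        # Equal weighting across tasks by default (an assumption).
        weights = {t: 1.0 / len(tasks) for t in tasks}
    per_task = {}
    for t in tasks:
        diff = student_feats[t] - expert_feats[t]
        per_task[t] = float(np.mean(diff ** 2))
    total = sum(weights[t] * per_task[t] for t in tasks)
    return total, per_task

# Toy example: a 4-d student embedding imitating two hypothetical experts,
# e.g. one specialized for detection and one for re-identification.
student = {"det": np.array([0.9, 0.1, 0.0, 0.2]),
           "reid": np.array([0.5, 0.5, 0.1, 0.0])}
experts = {"det": np.array([1.0, 0.0, 0.0, 0.0]),
           "reid": np.array([0.5, 0.5, 0.0, 0.0])}
total, per_task = distillation_loss(student, experts)
```

Minimizing such a loss lets a compact student mimic multiple deep experts at once, which is the mechanism behind the speed/accuracy trade-off the abstract reports.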

Original language: English
Pages (from-to): 5093-5104
Number of pages: 12
Journal: IEEE Transactions on Cybernetics
Volume: 51
Issue number: 10
DOIs
State: Published - Oct 1 2021

Keywords

  • Knowledge distillation
  • multi-task learning
  • person search
