
Exploring the Potential of Class-Specific Encoding for Few-Shot Object Detection

  • Wei Zhou
  • Zongling Li
  • Shu Hu
  • Shanmin Yang
  • Tao Wu
  • Siwei Lyu
  • Ying Fu
  • Xi Wu
  • Xin Wang
  • Chengdu University of Information Technology
  • Purdue University
  • SUNY Buffalo
  • SUNY Albany

Research output: Contribution to journal › Article › peer-review

Abstract

Recently, object detection methods based on few-shot learning have significantly advanced the field of object detection, because these methods can achieve strong detection results with only a few training samples. However, detecting novel classes with existing methods often requires time-consuming model retraining. Moreover, the detection performance of retrained models on base classes may decrease. To address these problems, this paper proposes a new few-shot object detection model that can efficiently detect newly introduced novel classes without fine-tuning. The model detects novel and base classes simultaneously, effectively mitigating catastrophic forgetting. It incorporates a base-class detector augmented with an additional contrastive branch that extracts class representation information. This decouples object localization from classification, leading to a marked improvement in the model's generalization to novel classes. In addition, we investigate the effectiveness of self-supervised and supervised contrastive losses for class-specific encodings in our framework.
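To make the supervised contrastive loss mentioned above concrete, the sketch below implements the standard SupCon formulation (pulling same-class encodings together and pushing different-class encodings apart); this is a generic illustration of that loss family, not the paper's actual implementation, and the function names and temperature value here are hypothetical:

```python
import math

def normalize(v):
    """L2-normalize an embedding vector, as contrastive losses operate on unit vectors."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def supcon_loss(embeddings, labels, tau=0.1):
    """Supervised contrastive (SupCon) loss over a batch of class-specific encodings.

    For each anchor i, all other samples with the same label act as positives;
    every other sample in the batch appears in the softmax denominator.
    """
    z = [normalize(v) for v in embeddings]
    n = len(z)
    total = 0.0
    for i in range(n):
        positives = [p for p in range(n) if p != i and labels[p] == labels[i]]
        if not positives:
            continue  # anchors with no positive pair contribute nothing
        # Temperature-scaled cosine similarities to every other sample
        sims = {a: sum(x * y for x, y in zip(z[i], z[a])) / tau
                for a in range(n) if a != i}
        denom = sum(math.exp(s) for s in sims.values())
        # Average negative log-likelihood over the positive set
        total += -sum(math.log(math.exp(sims[p]) / denom) for p in positives) / len(positives)
    return total / n
```

A batch whose same-class embeddings are well clustered yields a lower loss than one where classes are mixed, which is the signal the contrastive branch would exploit to separate class encodings.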

Original language: English
Pages (from-to): 1441-1453
Number of pages: 13
Journal: IEEE Transactions on Emerging Topics in Computational Intelligence
Volume: 10
Issue number: 2
State: Published - Apr 1 2026

Keywords

  • Object detection
  • class-specific encoding
  • contrastive learning
  • few-shot object detector
