
PROGRESSIVE KNOWLEDGE DISTILLATION FOR EARLY ACTION RECOGNITION

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

11 Scopus citations

Abstract

We present a novel framework for training a recurrent neural network to recognize human actions early, an important but challenging task given the need to identify an ongoing action from only partial observation. Our framework is based on knowledge distillation: the early-recognition network is viewed as a student model, trained with knowledge distilled from a more knowledgeable teacher model that can peek into the future and incorporate extra observations of the action under consideration. The framework applies in both supervised and semi-supervised settings, utilizing both labeled and unlabeled training data. Experiments on the UCF101, SYSU 3DHOI, and NTU RGB-D datasets show the effectiveness of knowledge distillation for early recognition, including when only a small amount of annotated training data is available.
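The teacher-student setup described in the abstract can be illustrated with a standard temperature-softened distillation loss, where the student's prediction from a partial clip is matched to the teacher's prediction from a fuller observation. This is a minimal sketch of generic knowledge distillation, not the paper's exact formulation; the function names and the NumPy implementation are our own assumptions:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher temperature softens the distribution.
    z = np.asarray(logits, dtype=float) / temperature
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL(teacher || student) on temperature-softened class distributions,
    # the standard knowledge-distillation objective. In the early-recognition
    # setting, student_logits come from a partial observation of the action,
    # while teacher_logits come from a model that has seen more frames.
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's prediction
    eps = 1e-12  # avoid log(0)
    return float(np.sum(p * (np.log(p + eps) - np.log(q + eps))))
```

Because the teacher's soft targets carry information even without ground-truth labels, a loss of this form can also be applied to unlabeled clips, which is what enables the semi-supervised setting mentioned in the abstract.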

Original language: English
Title of host publication: 2021 IEEE International Conference on Image Processing, ICIP 2021 - Proceedings
Publisher: IEEE Computer Society
Pages: 2583-2587
Number of pages: 5
ISBN (Electronic): 9781665441155
DOIs
State: Published - 2021
Event: 28th IEEE International Conference on Image Processing, ICIP 2021 - Anchorage, United States
Duration: Sep 19 2021 - Sep 22 2021

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
Volume: 2021-September

Conference

Conference: 28th IEEE International Conference on Image Processing, ICIP 2021
Country/Territory: United States
City: Anchorage
Period: 09/19/21 - 09/22/21
