Lightweight Object Detection-Tracking using Deep Feature Distillation

  • Bin Xue
  • Qinghua Zheng
  • Zhinan Li
  • Weihu Zhao
  • Heling Wang
  • Xue Feng

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Object detection and tracking are critical and fundamental problems in machine vision tasks. In this paper, an object detection and tracking method based on deep feature distillation is proposed. In particular, an adaptive, unsupervised, unified Teacher-Student framework is developed. The Teacher module is implemented as an expandable generative adversarial network mixture model, and a knowledge discrepancy ranking (KDR) scheme is designed to optimize Teacher resource allocation using historical underlying knowledge. The Student module is built on a lightweight probabilistic generative model, and an unsupervised learning scheme based on Gumbel-Softmax sampling optimization is presented to train the two modules jointly. A series of experiments on authoritative datasets demonstrates that the proposed method outperforms state-of-the-art comparison methods.
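The abstract gives no implementation details, so the following is only a minimal sketch, assuming a PyTorch setting, of how a lightweight Student could be distilled against a Teacher's deep features with a Gumbel-Softmax sampling step. All module names, dimensions, and the simple MSE distillation loss are illustrative assumptions, not the authors' code.

```python
# Minimal feature-distillation sketch (illustrative only; not the authors' implementation).
# A frozen Teacher is assumed to produce deep features; the lightweight Student is trained
# unsupervised to match them, with Gumbel-Softmax providing a differentiable sample over a
# categorical latent in the Student head.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LightweightStudent(nn.Module):
    """Small convolutional student with a categorical latent sampled via Gumbel-Softmax."""
    def __init__(self, in_ch=3, feat_dim=128, num_latents=16):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.latent_logits = nn.Linear(64, num_latents)  # categorical latent parameters
        self.proj = nn.Linear(num_latents, feat_dim)     # project the sample to feature space

    def forward(self, x, tau=1.0):
        h = self.backbone(x)
        logits = self.latent_logits(h)
        # Differentiable draw from the categorical distribution (straight-through Gumbel-Softmax).
        z = F.gumbel_softmax(logits, tau=tau, hard=True)
        return self.proj(z)

def distillation_step(student, teacher_feats, images, optimizer, tau=1.0):
    """One unsupervised step: match Student features to detached Teacher features."""
    student_feats = student(images, tau=tau)
    loss = F.mse_loss(student_feats, teacher_feats.detach())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    student = LightweightStudent()
    optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
    images = torch.randn(4, 3, 64, 64)      # dummy image batch
    teacher_feats = torch.randn(4, 128)     # stand-in for Teacher (e.g. GAN-mixture) features
    print(distillation_step(student, teacher_feats, images, optimizer))
```

In the paper's framework the Teacher features would come from the expandable GAN mixture model and the KDR scheme would decide which Teacher components to allocate; both are replaced here by a random placeholder tensor for brevity.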

Original language: English
Title of host publication: Proceedings of the 2024 16th International Conference on Machine Learning and Computing, ICMLC 2024
Publisher: Association for Computing Machinery
Pages: 287-291
Number of pages: 5
ISBN (Electronic): 9798400709234
DOIs
State: Published - 2 Feb 2024
Event: 16th International Conference on Machine Learning and Computing, ICMLC 2024 - Shenzhen, China
Duration: 2 Feb 2024 - 5 Feb 2024

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 16th International Conference on Machine Learning and Computing, ICMLC 2024
Country/Territory: China
City: Shenzhen
Period: 2/02/24 - 5/02/24

Keywords

  • Object detection
  • deep learning
  • knowledge distillation
  • object tracking
