Heterogeneous Mutual Knowledge Distillation for Wearable Human Activity Recognition

  • Zhiwen Xiao
  • Huanlai Xing
  • Rong Qu
  • Hui Li
  • Xinzhou Cheng
  • Lexi Xu
  • Li Feng
  • Qian Wan

Research output: Contribution to journal › Article › peer-review


Abstract

Recently, numerous deep learning algorithms have addressed wearable human activity recognition (HAR), but they often struggle with efficient knowledge transfer to lightweight models for mobile devices. Knowledge distillation (KD) is a popular technique for model compression, transferring knowledge from a complex teacher to a compact student. Most existing KD algorithms assume homogeneous architectures, which hinders performance in heterogeneous setups, an under-explored area in wearable HAR. To bridge this gap, we propose a heterogeneous mutual KD (HMKD) framework for wearable HAR. HMKD establishes mutual learning within the intermediate and output layers of both teacher and student models. To accommodate substantial structural differences between teacher and student, we employ a weighted ensemble feature approach to merge the features from their intermediate layers, enhancing knowledge exchange between them. Experimental results on the HAPT, WISDM, and UCI_HAR datasets show that HMKD outperforms ten state-of-the-art KD algorithms in classification accuracy. Notably, with ResNetLSTMaN as the teacher and MLP as the student, HMKD increases the MLP's F1 score by 9.19% on the HAPT dataset.
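The abstract describes two ingredients: bidirectional (mutual) distillation at the output layer, and a weighted ensemble that merges intermediate-layer features so structurally different models can exchange knowledge. The paper's exact loss weighting, projection layers, and ensemble weights are not given here, so the sketch below is a minimal illustration under assumed names (`mutual_kd_losses`, `weighted_ensemble`, a plain mean-squared feature alignment), not the authors' implementation.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened class distribution from raw logits."""
    z = np.asarray(z, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def kl_div(p, q, eps=1e-12):
    """KL divergence between two discrete distributions."""
    p, q = np.asarray(p), np.asarray(q)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def weighted_ensemble(features, weights):
    """Merge intermediate-layer features (assumed already projected
    to a common dimension) using normalized weights."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * np.asarray(f, dtype=float) for wi, f in zip(w, features))

def mutual_kd_losses(teacher_logits, student_logits,
                     teacher_feats, student_feats,
                     feat_weights, T=4.0):
    # Output-layer mutual distillation: each model is trained to match
    # the other's softened prediction (KL in both directions).
    pt = softmax(teacher_logits, T)
    ps = softmax(student_logits, T)
    loss_t_to_s = kl_div(pt, ps) * T * T
    loss_s_to_t = kl_div(ps, pt) * T * T

    # Intermediate-layer exchange: both models are pulled toward the
    # same weighted ensemble feature, here via a squared-error penalty.
    ens_t = weighted_ensemble(teacher_feats, feat_weights)
    ens_s = weighted_ensemble(student_feats, feat_weights)
    feat_loss = float(np.mean((ens_t - ens_s) ** 2))
    return loss_t_to_s, loss_s_to_t, feat_loss
```

In training, each side would minimize its own cross-entropy plus its direction of the mutual KL and the feature-alignment term; the relative weights of these terms are hyperparameters not specified in this abstract.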

Original language: English
Pages (from-to): 16589-16603
Number of pages: 15
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 36
Issue number: 9
DOIs
State: Published - 2025

Keywords

  • Data mining
  • human activity recognition (HAR)
  • knowledge distillation (KD)
  • model compression
  • wearable sensors
