LLM-Enhanced Multi-Teacher Knowledge Distillation for Modality-Incomplete Emotion Recognition in Daily Healthcare

Research output: Contribution to journal › Article › peer-review

10 Scopus citations

Abstract

The critical importance of monitoring and recognizing human emotional states in healthcare has led to a surge of proposals for EEG-based multimodal emotion recognition in recent years. However, stringent data acquisition conditions make EEG signals difficult to obtain in daily healthcare settings, resulting in incomplete modalities. Existing studies mitigate this problem with knowledge distillation, transferring knowledge from multimodal networks to unimodal ones. These methods, however, rely on a single teacher model to transfer integrated feature-extraction knowledge, particularly for the spatial and temporal features in EEG data. To address this limitation, we propose a multi-teacher knowledge distillation framework enhanced with a Large Language Model (LLM), which transfers integrated feature-extraction knowledge to enable effective feature learning in the student network. Specifically, we employ an LLM as the teacher for extracting temporal features and a graph convolutional neural network as the teacher for extracting spatial features. To further enhance knowledge distillation, we introduce causal masking and a confidence indicator into the LLM so that the most discriminative features are transferred. Extensive experiments on the DEAP and MAHNOB-HCI datasets demonstrate that our model outperforms existing methods in the modality-incomplete scenario. This study underscores the potential of large models in this field.
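The abstract describes combining soft targets from two teachers (a temporal LLM teacher and a spatial GCN teacher), weighted by a confidence indicator. The paper's exact loss is not given here, so the following is only an illustrative sketch of a generic multi-teacher distillation objective: temperature-scaled softmax over each model's logits, then a confidence-weighted sum of KL-divergence terms. All names (`multi_teacher_distill_loss`, `confidence`, the temperature `T`) are assumptions for illustration, not the authors' implementation.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of raw logits."""
    m = max(l / T for l in logits)                 # subtract max for stability
    exps = [math.exp(l / T - m) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl_div(p, q, eps=1e-12):
    """KL(p || q) between two discrete distributions of equal length."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def multi_teacher_distill_loss(student_logits, temporal_logits, spatial_logits,
                               confidence=0.5, T=2.0):
    """Confidence-weighted sum of distillation terms from two teachers.

    `confidence` is a stand-in for the paper's confidence indicator:
    it weights the temporal (LLM) teacher against the spatial (GCN) teacher.
    """
    s = softmax(student_logits, T)
    t_temporal = softmax(temporal_logits, T)
    t_spatial = softmax(spatial_logits, T)
    return (confidence * kl_div(t_temporal, s)
            + (1.0 - confidence) * kl_div(t_spatial, s))
```

When the student already matches both teachers, the loss is zero; it grows as the student's distribution diverges from either teacher, with the confidence indicator deciding which teacher dominates the gradient signal.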

Original language: English
Pages (from-to): 6406-6416
Number of pages: 11
Journal: IEEE Journal of Biomedical and Health Informatics
Volume: 29
Issue number: 9
DOIs
State: Published - 2025

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 3 - Good Health and Well-being

Keywords

  • healthcare
  • emotion recognition
  • large language model
  • multi-teacher knowledge distillation
