An Incremental Learning Method With Feature-Attention Distillation and Logit Adjustment for Rotating Machinery Fault Diagnosis

  • Yasong Li
  • Hong Xu
  • Yuangui Yang
  • Chenye Hu
  • Chuang Sun
  • Huimin Song
  • Laihao Yang

Research output: Contribution to journal › Article › peer-review

Abstract

Deep learning (DL)-based diagnosis methods can accurately identify fault modes and have therefore attracted widespread attention from researchers. In actual industrial environments, data from new fault modes of mechanical systems emerge continuously, which requires the model to be updated in a timely manner while maintaining high diagnosis accuracy. Class incremental learning (CIL) addresses classification problems with a continuously increasing number of classes and thus meets this requirement of industrial diagnosis. However, directly incorporating new-class data into the training set to optimize the network causes the model to forget old-class knowledge, resulting in irreversible performance degradation, known as catastrophic forgetting. To solve this problem, this article constructs an incremental learning method with feature-attention distillation and logit adjustment (ILD-FADLA) for fault diagnosis. Specifically, a residual network composed of convolutional blocks is used as the feature extractor, and channel attention and spatial attention are superimposed in each residual block to enhance the feature extraction capability. To alleviate catastrophic forgetting, the proposed ILD-FADLA distills the attention weights at each layer and the feature-relationship information before the classifier. In addition, a logit adjustment cross-entropy (LACE) loss is used to mitigate the bias of the classifier toward new classes. Experimental results on two private datasets show that the proposed ILD-FADLA improves the average accuracy of the incremental phase by 17.76% and 22.42% over the baseline method.
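The abstract does not reproduce the paper's loss formulas, so the following is only a minimal PyTorch sketch of two commonly used formulations consistent with the description: a logit-adjusted cross-entropy in the style of Menon et al. (shifting logits by the log class prior to counter new-class bias) and an attention-transfer-style distillation term (matching normalized spatial attention maps between the old and updated feature extractors). The function names, the `tau` temperature, and the choice of `p`-norm pooling are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def logit_adjusted_ce(logits, targets, class_counts, tau=1.0):
    """Cross-entropy with logit adjustment: add tau * log(prior) to the
    logits so that over-represented (new) classes do not dominate the
    classifier. class_counts is a float tensor of per-class sample counts."""
    priors = class_counts / class_counts.sum()            # empirical class priors
    adjusted = logits + tau * torch.log(priors + 1e-12)   # broadcast over the batch
    return F.cross_entropy(adjusted, targets)

def attention_distill_loss(feat_student, feat_teacher, p=2):
    """Attention-transfer-style distillation between feature maps of shape
    (B, C, H, W): collapse channels with a p-norm, L2-normalize the flattened
    maps, and penalize their squared difference."""
    def att_map(f):
        a = f.pow(p).mean(dim=1).flatten(1)   # (B, H*W) spatial attention map
        return F.normalize(a, dim=1)
    return (att_map(feat_student) - att_map(feat_teacher)).pow(2).mean()
```

In a CIL training step, one would typically sum the LACE classification loss with the distillation loss computed against a frozen copy of the pre-update network at each residual stage; the relative weighting between the two terms is a tunable hyperparameter not specified in the abstract.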

Original language: English
Article number: 3547213
Journal: IEEE Transactions on Instrumentation and Measurement
Volume: 74
DOIs
State: Published - 2025

Keywords

  • Attention distillation
  • Imbalance fault diagnosis
  • Incremental learning
