Mutual information-based feature disentangled network for anomaly detection under variable working conditions

Abstract
Anomaly detection is of great significance for ensuring the operational safety of advanced equipment. However, existing anomaly detection algorithms assume that working conditions are invariant, which does not hold in actual industrial scenarios and may cause false or missed alarms when the working conditions change. To address this issue, we aim to learn a condition-independent feature space from which anomaly indicators can be derived, enabling condition-agnostic anomaly detection. The objective for finding such a space is first derived intuitively and then justified by information theory. Accordingly, a Mutual Information-based Feature Disentangled Network (MIFD-Net) is proposed for anomaly detection under variable working conditions. Specifically, to ensure feature independence from working conditions, the mutual information between latent representations and conditions is minimized by reducing its variational upper bound. Meanwhile, a self-supervised contrastive loss is derived to maximize the mutual information between latent features and observed signals, which improves the fault discriminability of the latent features. Finally, both feature constraints are integrated via dynamic weights during model training. Reconstruction errors derived from the latent features are defined as anomaly scores for condition-agnostic anomaly detection. Experiments on part-level and component-level fault datasets demonstrate the superiority of the proposed method under both multiple and varying working condition scenarios.
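The two ingredients named in the abstract — a contrastive (InfoNCE-style) lower bound that maximizes mutual information between latent features and observed signals, and a reconstruction error used as the anomaly score — can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the function names, the temperature parameter, and the use of cosine similarity are assumptions, and the variational upper bound on I(z; condition) is omitted for brevity.

```python
import numpy as np

def info_nce(z1, z2, tau=0.1):
    """InfoNCE-style contrastive loss between two views of the latent
    features. Minimizing it maximizes a lower bound on the mutual
    information between latent features and the observed signals.
    NOTE: illustrative sketch, not the MIFD-Net objective verbatim."""
    # Cosine similarity: L2-normalize each row, then take dot products.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                      # pairwise similarities
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Matching pairs (positives) sit on the diagonal.
    return -np.mean(np.diag(log_probs))

def anomaly_score(x, x_hat):
    """Per-sample reconstruction error used as the anomaly indicator:
    large error suggests the sample deviates from normal behavior."""
    return np.mean((x - x_hat) ** 2, axis=1)
```

In a full training loop, this contrastive term would be combined with the variational upper bound on the feature–condition mutual information via the dynamic weights the abstract mentions; at test time only `anomaly_score` is needed.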
| Original language | English |
|---|---|
| Article number | 110804 |
| Journal | Mechanical Systems and Signal Processing |
| Volume | 204 |
| State | Published - 1 Dec 2023 |
Keywords
- Anomaly detection
- Contrastive learning
- Deep learning
- Feature disentanglement
- Mutual information
- Variable working conditions