Learning from open-set noisy labels based on multi-prototype modeling

  • Yue Zhang
  • Yiyi Chen
  • Chaowei Fang
  • Qian Wang
  • Jiayi Wu
  • Jingmin Xin
Research output: Contribution to journal › Article › peer-review

10 Scopus citations

Abstract

In this paper, we propose a novel method to address the challenge of learning deep neural network models in the presence of open-set noisy labels, which include mislabeled samples from out-of-distribution categories. Previous methods rely on the distances between sample-wise predictions and labels to identify mislabeled samples and to distinguish between in-distribution (ID) and out-of-distribution (OOD) noisy samples, but they struggle to promptly separate the two types of noisy samples. To overcome these limitations, we propose a novel method that exploits feature information and cross-instance relationships, enabling a more comprehensive distinction between ID and OOD noisy samples. Our approach involves a multi-prototype modeling mechanism, where each class is represented by multiple prototypes to account for the diversity within categories. This mechanism helps distinguish in-distribution from out-of-distribution noisy samples by comparing sample features with class prototypes. We introduce an online algorithm for updating prototypes and enhance model optimization with cross-augmentation consistency and a noise-robust contrastive siamese learning technique. Extensive experiments on the CIFAR100, Clothing1M, and Food101N datasets show our method's superiority in handling noisy labels over existing approaches. The code will be available at https://github.com/daisyarg/LNL-MPM.
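The prototype-comparison idea from the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the cosine-similarity score, and the two thresholds (`tau_id`, `tau_ood`) are hypothetical choices for exposition. Each class is represented by K prototype vectors; a sample far from all prototypes is flagged as OOD noise, while a sample closer to another class's prototypes than to its labeled class is flagged as ID noise.

```python
import numpy as np

def classify_sample(feature, prototypes, label, tau_id=0.7, tau_ood=0.3):
    """Compare a sample feature against multi-prototype class models.

    feature:    array of shape (dim,), the sample's feature vector.
    prototypes: array of shape (num_classes, K, dim), K prototypes per class.
    label:      the sample's (possibly noisy) class label.
    Returns "clean", "id_noise", or "ood_noise" under a hypothetical
    threshold rule; the actual decision rule in the paper may differ.
    """
    f = feature / np.linalg.norm(feature)
    p = prototypes / np.linalg.norm(prototypes, axis=-1, keepdims=True)
    # Cosine similarity to every prototype, then the best match per class.
    sims = (p @ f).max(axis=1)            # shape (num_classes,)
    if sims.max() < tau_ood:
        return "ood_noise"                # far from every class: out-of-distribution
    if sims[label] >= tau_id and sims.argmax() == label:
        return "clean"                    # consistent with its labeled class
    return "id_noise"                     # in-distribution, but closer to another class

# Toy example: 2 classes, 2 prototypes each, in a 3-D feature space.
protos = np.array([[[1.0, 0.0, 0.0], [0.9, 0.1, 0.0]],
                   [[0.0, 1.0, 0.0], [0.0, 0.9, 0.1]]])
print(classify_sample(np.array([1.0, 0.0, 0.0]), protos, label=0))  # clean
print(classify_sample(np.array([0.0, 1.0, 0.0]), protos, label=0))  # id_noise
print(classify_sample(np.array([0.0, 0.0, 1.0]), protos, label=0))  # ood_noise
```

Using multiple prototypes per class (the max over K matches) lets a multi-modal category match any of its sub-clusters, which a single class centroid cannot do.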

Original language: English
Article number: 110902
Journal: Pattern Recognition
Volume: 157
DOIs
State: Published - Jan 2025

Keywords

  • Deep learning
  • Noisy label
  • Out-of-distribution
  • Prototype learning

