Deep Class-Incremental Learning From Decentralized Data

  • Xiaohan Zhang
  • Songlin Dong
  • Jinjie Chen
  • Qi Tian
  • Yihong Gong
  • Xiaopeng Hong
Research output: Contribution to journal › Article › peer-review

11 Scopus citations

Abstract

In this article, we focus on a new and challenging decentralized machine learning paradigm in which there are continuous inflows of data to be addressed and the data are stored in multiple repositories. We initiate the study of data-decentralized class-incremental learning (DCIL) by making the following contributions. First, we formulate the DCIL problem and develop the experimental protocol. Second, we introduce a paradigm to create a basic decentralized counterpart of typical (centralized) CIL approaches, and as a result, establish a benchmark for the DCIL study. Third, we further propose a decentralized composite knowledge incremental distillation (DCID) framework to transfer knowledge from historical models and multiple local sites to the general model continually. DCID consists of three main components, namely, local CIL, collaborated knowledge distillation (KD) among local models, and aggregated KD from local models to the general one. We comprehensively investigate our DCID framework using different implementations of the three components. Extensive experimental results demonstrate the effectiveness of our DCID framework. The source code of the baseline methods and the proposed DCIL is available at https://github.com/Vision-Intelligence-and-Robots-Group/DCIL.
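
The abstract's composite distillation idea (KD from the historical model to mitigate forgetting, plus aggregated KD from multiple local-site models to the general model) can be sketched as a weighted sum of KL-divergence terms. The sketch below is a minimal illustration only: the function names, the weighting parameters `alpha` and `beta`, and the temperature `T` are hypothetical, not the paper's actual implementation (see the linked repository for that).

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax, as commonly used in knowledge distillation.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) between temperature-softened distributions.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q))))

def composite_kd_loss(general_logits, old_general_logits, local_logits_list,
                      alpha=0.5, beta=0.5, T=2.0):
    # Term 1: distill from the historical (old-task) model to reduce forgetting.
    l_old = kd_loss(general_logits, old_general_logits, T)
    # Term 2: aggregated KD -- distill from the averaged soft targets
    # of the local-site teacher models.
    avg_local = np.mean([softmax(l, T) for l in local_logits_list], axis=0)
    q = softmax(general_logits, T)
    l_local = float(np.sum(avg_local * (np.log(avg_local) - np.log(q))))
    return alpha * l_old + beta * l_local
```

As a sanity check, the loss is zero when the general model already matches both the historical model and every local teacher, and positive otherwise.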

Original language: English
Pages (from-to): 7190-7203
Number of pages: 14
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 35
Issue number: 5
DOIs
State: Published - 1 May 2024

Keywords

  • Catastrophic forgetting
  • continuous learning
  • incremental learning (IL)
  • knowledge distillation (KD)
