CEAT: Continual Expansion and Absorption Transformer for Non-Exemplar Class-Incremental Learning

  • Songlin Dong
  • Xinyuan Gao
  • Yuhang He
  • Zhengdong Zhou
  • Alex C. Kot
  • Yihong Gong
Research output: Contribution to journal › Article › peer-review

5 Scopus citations

Abstract

In dynamic real-world scenarios, continual learning without forgetting old knowledge is essential, particularly under strict privacy protection or on resource-constrained edge devices where storing old exemplars is infeasible. Non-Exemplar Class-Incremental Learning (NECIL) has therefore garnered significant attention. Compared with the standard exemplar-based setting, it faces a more severe plasticity-stability dilemma and classifier bias. To address these challenges, we propose a framework based on the vision transformer architecture, called the Continual Expansion and Absorption Transformer (CEAT), which consists of two core components. First, we propose the Continual Expansion and Absorption (CEA) method to alleviate the trade-off between new and old classes by expanding a set of parameters (the EF layers) in parallel with the backbone to learn new tasks, while freezing the backbone to retain old-task knowledge. Before inference, the EF layers can be seamlessly absorbed into the ViT backbone through parameter recombination, mitigating storage and computational burdens. Second, we propose a Dynamic Boundary-Aware (DBA) method that generates dynamic pseudo-features for classifier calibration to address the classifier bias. Extensive experiments demonstrate that our approach achieves state-of-the-art performance, with significant improvements of 4.82% and 5.92% on TinyImageNet and ImageNet-Subset, respectively.
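The "absorption" step described above rests on a general re-parameterization fact: a parallel linear branch can be folded into the frozen linear weight it runs alongside, so the deployed model carries no extra parameters or compute. A minimal sketch of this idea, assuming a purely linear EF branch next to a linear backbone layer (the weight names and shapes here are illustrative, not the paper's exact formulation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen backbone weight of one linear layer, plus the parallel
# expanded-fusion (EF) branch trained on the new task.
# Names and the purely linear form are assumptions for illustration.
W_frozen = rng.standard_normal((8, 8))
W_ef = 0.1 * rng.standard_normal((8, 8))


def forward_expanded(x):
    # During incremental training: backbone output plus EF branch output.
    return W_frozen @ x + W_ef @ x


# "Absorption" by parameter recombination: since both branches are
# linear in x, they collapse into a single weight before inference.
W_merged = W_frozen + W_ef

x = rng.standard_normal(8)
assert np.allclose(forward_expanded(x), W_merged @ x)
```

Because the merge is exact for linear maps, inference cost after absorption equals that of the original backbone, which is what lets the expansion scale over many tasks without growing the deployed model.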

Original language: English
Pages (from-to): 3146-3159
Number of pages: 14
Journal: IEEE Transactions on Circuits and Systems for Video Technology
Volume: 35
Issue number: 4
DOIs
State: Published - 2025

Keywords

  • Class-incremental learning
  • continual expansion and absorption
  • dynamic boundary-aware
  • non-exemplar
