TY - JOUR
T1 - Analogical Learning-Based Few-Shot Class-Incremental Learning
AU - Li, Jiashuo
AU - Dong, Songlin
AU - Gong, Yihong
AU - He, Yuhang
AU - Wei, Xing
N1 - Publisher Copyright:
© 1991-2012 IEEE.
PY - 2024
Y1 - 2024
N2 - Few-shot class-incremental learning (FSCIL) is a prominent research topic in the machine learning community. It faces two significant challenges: forgetting old class knowledge and overfitting to the limited training examples of new classes. In this paper, we present a novel FSCIL approach inspired by the human brain's analogical learning mechanism, which enables humans to form knowledge of a target domain from knowledge of source domains that are analogous to the target in some respects. The proposed analogical learning-based FSCIL (ALFSCIL) method consists of two major components: a new class classifier constructor (NCCC) and meta-analogical training (MAT). The NCCC module uses a multi-head cross-attention transformer to compute analogies between new and old classes, generating new class classifiers by blending old class classifiers according to the computed analogies. The MAT module updates the parameters of the CNN feature extractor, the NCCC module, and the knowledge for each encountered class after each FSCIL session. We formulate the optimization process as a bi-level optimization problem (BOP), and theoretical analysis establishes both the stability and the plasticity of the proposed model. Experimental evaluations show that ALFSCIL achieves state-of-the-art accuracy on three benchmark datasets: CIFAR100, miniImageNet, and CUB200.
AB - Few-shot class-incremental learning (FSCIL) is a prominent research topic in the machine learning community. It faces two significant challenges: forgetting old class knowledge and overfitting to the limited training examples of new classes. In this paper, we present a novel FSCIL approach inspired by the human brain's analogical learning mechanism, which enables humans to form knowledge of a target domain from knowledge of source domains that are analogous to the target in some respects. The proposed analogical learning-based FSCIL (ALFSCIL) method consists of two major components: a new class classifier constructor (NCCC) and meta-analogical training (MAT). The NCCC module uses a multi-head cross-attention transformer to compute analogies between new and old classes, generating new class classifiers by blending old class classifiers according to the computed analogies. The MAT module updates the parameters of the CNN feature extractor, the NCCC module, and the knowledge for each encountered class after each FSCIL session. We formulate the optimization process as a bi-level optimization problem (BOP), and theoretical analysis establishes both the stability and the plasticity of the proposed model. Experimental evaluations show that ALFSCIL achieves state-of-the-art accuracy on three benchmark datasets: CIFAR100, miniImageNet, and CUB200.
KW - Few-shot class-incremental learning
KW - analogical learning
KW - class classifier constructor
KW - meta-analogical training
KW - transformer
UR - https://www.scopus.com/pages/publications/85182371666
U2 - 10.1109/TCSVT.2024.3350913
DO - 10.1109/TCSVT.2024.3350913
M3 - Article
AN - SCOPUS:85182371666
SN - 1051-8215
VL - 34
SP - 5493
EP - 5504
JO - IEEE Transactions on Circuits and Systems for Video Technology
JF - IEEE Transactions on Circuits and Systems for Video Technology
IS - 7
ER -