TY - JOUR
T1 - Learning to complement
T2 - Relation complementation network for few-shot class-incremental learning
AU - Wang, Ye
AU - Wang, Yaxiong
AU - Zhao, Guoshuai
AU - Qian, Xueming
N1 - Publisher Copyright:
© 2023 Elsevier B.V.
PY - 2023/12/20
Y1 - 2023/12/20
N2 - Real-world industrial scenarios pose a challenging task known as few-shot class-incremental learning (FSCIL), which aims to recognize new classes from only a few samples without forgetting the old classes. Despite recent advances in FSCIL, most existing methods rely on a single metric for making incremental relation predictions, which is unilateral and lacks stability. In this paper, we remedy this issue from two aspects. Specifically, to make convincing relation predictions, we first propose a relation complementation strategy that aggregates different metric models to investigate the comprehensive relation between classifier weights and test features. Then, to make the proposed strategy fit incremental scenarios well, we design a pseudo incremental relation complementation learning scheme that constructs learning tasks by mimicking the data setting of real incremental sessions. Taken together, our proposed method, dubbed Relation Complementation Network (RCN), achieves state-of-the-art performance on miniImageNet, CIFAR100, and CUB200. Our code is available at https://github.com/YeZiLaiXi/KT-RCN.git.
AB - Real-world industrial scenarios pose a challenging task known as few-shot class-incremental learning (FSCIL), which aims to recognize new classes from only a few samples without forgetting the old classes. Despite recent advances in FSCIL, most existing methods rely on a single metric for making incremental relation predictions, which is unilateral and lacks stability. In this paper, we remedy this issue from two aspects. Specifically, to make convincing relation predictions, we first propose a relation complementation strategy that aggregates different metric models to investigate the comprehensive relation between classifier weights and test features. Then, to make the proposed strategy fit incremental scenarios well, we design a pseudo incremental relation complementation learning scheme that constructs learning tasks by mimicking the data setting of real incremental sessions. Taken together, our proposed method, dubbed Relation Complementation Network (RCN), achieves state-of-the-art performance on miniImageNet, CIFAR100, and CUB200. Our code is available at https://github.com/YeZiLaiXi/KT-RCN.git.
KW - Class-incremental learning
KW - Few-shot class-incremental learning
KW - Image recognition
KW - Lifelong learning
UR - https://www.scopus.com/pages/publications/85175524076
U2 - 10.1016/j.knosys.2023.111130
DO - 10.1016/j.knosys.2023.111130
M3 - Article
AN - SCOPUS:85175524076
SN - 0950-7051
VL - 282
JO - Knowledge-Based Systems
JF - Knowledge-Based Systems
M1 - 111130
ER -