TY - GEN
T1 - Knowledge Restore and Transfer for Multi-Label Class-Incremental Learning
AU - Dong, Songlin
AU - Luo, Haoyu
AU - He, Yuhang
AU - Wei, Xing
AU - Cheng, Jie
AU - Gong, Yihong
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Current class-incremental learning research mainly focuses on single-label classification tasks, while multi-label class-incremental learning (MLCIL), which has more practical application scenarios, is rarely studied. Although many anti-forgetting methods have been proposed to address catastrophic forgetting in single-label class-incremental learning, these methods struggle with the MLCIL problem due to label absence and information dilution. To solve these problems, we propose a Knowledge Restore and Transfer (KRT) framework containing two key components. First, a dynamic pseudo-label (DPL) module is proposed to solve the label absence problem by restoring the knowledge of old classes to the new data. Second, an incremental cross-attention (ICA) module is designed to maintain and transfer the old knowledge to solve the information dilution problem. Comprehensive experimental results on the MS-COCO and PASCAL VOC datasets demonstrate the effectiveness of our method in improving recognition performance and mitigating forgetting on multi-label class-incremental learning tasks. The source code is available at https://github.com/witdsl/KRT-MLCIL.
AB - Current class-incremental learning research mainly focuses on single-label classification tasks, while multi-label class-incremental learning (MLCIL), which has more practical application scenarios, is rarely studied. Although many anti-forgetting methods have been proposed to address catastrophic forgetting in single-label class-incremental learning, these methods struggle with the MLCIL problem due to label absence and information dilution. To solve these problems, we propose a Knowledge Restore and Transfer (KRT) framework containing two key components. First, a dynamic pseudo-label (DPL) module is proposed to solve the label absence problem by restoring the knowledge of old classes to the new data. Second, an incremental cross-attention (ICA) module is designed to maintain and transfer the old knowledge to solve the information dilution problem. Comprehensive experimental results on the MS-COCO and PASCAL VOC datasets demonstrate the effectiveness of our method in improving recognition performance and mitigating forgetting on multi-label class-incremental learning tasks. The source code is available at https://github.com/witdsl/KRT-MLCIL.
UR - https://www.scopus.com/pages/publications/85188234366
U2 - 10.1109/ICCV51070.2023.01715
DO - 10.1109/ICCV51070.2023.01715
M3 - Conference contribution
AN - SCOPUS:85188234366
T3 - Proceedings of the IEEE International Conference on Computer Vision
SP - 18665
EP - 18674
BT - Proceedings - 2023 IEEE/CVF International Conference on Computer Vision, ICCV 2023
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2023 IEEE/CVF International Conference on Computer Vision, ICCV 2023
Y2 - 2 October 2023 through 6 October 2023
ER -