TY - GEN
T1 - Few-Shot Class-Incremental Learning via Relation Knowledge Distillation
AU - Dong, Songlin
AU - Hong, Xiaopeng
AU - Tao, Xiaoyu
AU - Chang, Xinyuan
AU - Wei, Xing
AU - Gong, Yihong
N1 - Publisher Copyright:
Copyright © 2021, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
PY - 2021
Y1 - 2021
N2 - In this paper, we focus on the challenging few-shot class-incremental learning (FSCIL) problem, which requires transferring knowledge from old tasks to new ones while overcoming catastrophic forgetting. We propose the exemplar relation distillation incremental learning framework to balance old-knowledge preservation and new-knowledge adaptation. First, we construct an exemplar relation graph to represent the knowledge learned by the original network and update it gradually as new tasks are learned. Then, an exemplar relation loss function is introduced to discover the relation knowledge between different classes and to learn and transfer the structural information in the relation graph. Extensive experiments demonstrate that relation knowledge does exist in the exemplars and that our approach outperforms other state-of-the-art class-incremental learning methods on the CIFAR100, miniImageNet, and CUB200 datasets.
AB - In this paper, we focus on the challenging few-shot class-incremental learning (FSCIL) problem, which requires transferring knowledge from old tasks to new ones while overcoming catastrophic forgetting. We propose the exemplar relation distillation incremental learning framework to balance old-knowledge preservation and new-knowledge adaptation. First, we construct an exemplar relation graph to represent the knowledge learned by the original network and update it gradually as new tasks are learned. Then, an exemplar relation loss function is introduced to discover the relation knowledge between different classes and to learn and transfer the structural information in the relation graph. Extensive experiments demonstrate that relation knowledge does exist in the exemplars and that our approach outperforms other state-of-the-art class-incremental learning methods on the CIFAR100, miniImageNet, and CUB200 datasets.
UR - https://www.scopus.com/pages/publications/85119577520
U2 - 10.1609/aaai.v35i2.16213
DO - 10.1609/aaai.v35i2.16213
M3 - Conference contribution
AN - SCOPUS:85119577520
T3 - 35th AAAI Conference on Artificial Intelligence, AAAI 2021
SP - 1256
EP - 1263
BT - 35th AAAI Conference on Artificial Intelligence, AAAI 2021
PB - Association for the Advancement of Artificial Intelligence
T2 - 35th AAAI Conference on Artificial Intelligence, AAAI 2021
Y2 - 2 February 2021 through 9 February 2021
ER -