TY - GEN
T1 - DYSON
T2 - 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2024
AU - He, Yuhang
AU - Chen, Yingjie
AU - Jin, Yuhan
AU - Dong, Songlin
AU - Wei, Xing
AU - Gong, Yihong
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - In this paper, we focus on a challenging Online Task-Free Class Incremental Learning (OTFCIL) problem. Different from the existing methods that continuously learn the feature space from data streams, we propose a novel compute-and-align paradigm for the OTFCIL. It first computes an optimal geometry, i.e., the class prototype distribution, for classifying existing classes and updates it when new classes emerge, and then trains a DNN model by aligning its feature space to the optimal geometry. To this end, we develop a novel Dynamic Neural Collapse (DNC) algorithm to compute and update the optimal geometry. The DNC expands the geometry when new classes emerge without loss of the geometry optimality and guarantees the drift distance of old class prototypes with an explicit upper bound. On this basis, we propose a novel DYnamic feature space Self-OrganizatioN (DYSON) method containing three major components, including 1) a feature extractor, 2) a Dynamic Feature-Geometry Alignment (DFGA) module aligning the feature space to the optimal geometry computed by DNC and 3) a training-free class-incremental classifier derived from the DNC geometry. Experimental comparison results on four benchmark datasets, including CIFAR10, CIFAR100, CUB200, and CoRe50, demonstrate the efficiency and superiority of the DYSON method. The source code is released at https://github.com/isCDX2IDYSON.
AB - In this paper, we focus on a challenging Online Task-Free Class Incremental Learning (OTFCIL) problem. Different from the existing methods that continuously learn the feature space from data streams, we propose a novel compute-and-align paradigm for the OTFCIL. It first computes an optimal geometry, i.e., the class prototype distribution, for classifying existing classes and updates it when new classes emerge, and then trains a DNN model by aligning its feature space to the optimal geometry. To this end, we develop a novel Dynamic Neural Collapse (DNC) algorithm to compute and update the optimal geometry. The DNC expands the geometry when new classes emerge without loss of the geometry optimality and guarantees the drift distance of old class prototypes with an explicit upper bound. On this basis, we propose a novel DYnamic feature space Self-OrganizatioN (DYSON) method containing three major components, including 1) a feature extractor, 2) a Dynamic Feature-Geometry Alignment (DFGA) module aligning the feature space to the optimal geometry computed by DNC and 3) a training-free class-incremental classifier derived from the DNC geometry. Experimental comparison results on four benchmark datasets, including CIFAR10, CIFAR100, CUB200, and CoRe50, demonstrate the efficiency and superiority of the DYSON method. The source code is released at https://github.com/isCDX2IDYSON.
KW - continual learning
KW - dynamic neural collapse
KW - feature space organization
UR - https://www.scopus.com/pages/publications/85207270055
U2 - 10.1109/CVPR52733.2024.02241
DO - 10.1109/CVPR52733.2024.02241
M3 - Conference contribution
AN - SCOPUS:85207270055
T3 - Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
SP - 23741
EP - 23751
BT - Proceedings - 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2024
PB - IEEE Computer Society
Y2 - 16 June 2024 through 22 June 2024
ER -