
Complementary Relation Contrastive Distillation

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

108 Citations (Scopus)

Abstract

Knowledge distillation aims to transfer representation ability from a teacher model to a student model. Previous approaches focus on either individual representation distillation or inter-sample similarity preservation. We argue, however, that the inter-sample relation conveys abundant information and needs to be distilled in a more effective way. In this paper, we propose a novel knowledge distillation method, namely Complementary Relation Contrastive Distillation (CRCD), to transfer structural knowledge from the teacher to the student. Specifically, we estimate the mutual relation in an anchor-based way and distill the anchor-student relation under the supervision of its corresponding anchor-teacher relation. To make it more robust, mutual relations are modeled by two complementary elements: the feature and its gradient. Furthermore, the lower bound of mutual information between the anchor-teacher relation distribution and the anchor-student relation distribution is maximized via a relation contrastive loss, which distills both the sample representation and the inter-sample relations. Experiments on different benchmarks demonstrate the effectiveness of our proposed CRCD.
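The anchor-based relation contrastive idea described in the abstract can be illustrated with a minimal NumPy sketch. This is a hypothetical simplification, not the paper's implementation: relations are modeled only as normalized feature differences (the complementary gradient term is omitted), and an InfoNCE-style loss treats the teacher relation for the same anchor-sample pair as the positive, which maximizes a lower bound on the mutual information between the two relation distributions.

```python
import numpy as np

def relation_contrastive_loss(anchor, teacher, student, tau=0.1):
    """Sketch of an anchor-based relation contrastive (InfoNCE-style) loss.

    anchor:  (A, D) anchor features
    teacher: (N, D) teacher features for a batch of samples
    student: (N, D) student features for the same samples
    Relations here are simply normalized differences (anchor - sample);
    CRCD additionally models relations via gradients, omitted for brevity.
    """
    def relations(anc, feats):
        # (A, N, D): relation vector from each anchor to each sample
        r = anc[:, None, :] - feats[None, :, :]
        return r / (np.linalg.norm(r, axis=-1, keepdims=True) + 1e-8)

    r_t = relations(anchor, teacher)   # anchor-teacher relations
    r_s = relations(anchor, student)   # anchor-student relations

    # Similarity of every student relation to every teacher relation
    # sharing the same anchor: logits[a, n, m] = <r_s[a, n], r_t[a, m]>
    logits = np.einsum('and,amd->anm', r_s, r_t) / tau

    # InfoNCE: the matching teacher relation (diagonal m == n) is positive
    logits -= logits.max(axis=-1, keepdims=True)             # stability
    log_prob = logits - np.log(np.exp(logits).sum(-1, keepdims=True))
    idx = np.arange(r_s.shape[1])
    return -log_prob[:, idx, idx].mean()
```

When the student matches the teacher exactly, every positive similarity is maximal, so the loss stays well below the log N chance level; a mismatched student drifts toward it.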

Original language: English
Title of host publication: Proceedings - 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2021
Publisher: IEEE Computer Society
Pages: 9256-9265
Number of pages: 10
ISBN (Electronic): 9781665445092
DOI
Publication status: Published - 2021
Event: 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2021 - Virtual, Online, United States
Duration: 19 Jun 2021 – 25 Jun 2021

Publication series

Name: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
ISSN (Print): 1063-6919

Conference

Conference: 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2021
Country/Territory: United States
City: Virtual, Online
Period: 19/06/21 – 25/06/21
