TY - JOUR
T1 - FCHP: Exploring the Discriminative Feature and Feature Correlation of Feature Maps for Hierarchical DNN Pruning and Compression
AU - Zhang, Haonan
AU - Liu, Longjun
AU - Zhou, Hengyi
AU - Si, Liang
AU - Sun, Hongbin
AU - Zheng, Nanning
N1 - Publisher Copyright:
© 1991-2012 IEEE.
PY - 2022/10/1
Y1 - 2022/10/1
N2 - Pruning can remove redundant parameters and structures from Deep Neural Networks (DNNs) to reduce inference time and memory overhead. As one of the important components of DNNs, feature maps (FMs) have been widely used in network pruning. However, previous approaches neither fully investigate the discriminative features in FMs nor explicitly utilize all the features associated with each layer during pruning. In this paper, we explore the discriminative features of FMs and explicitly investigate the two-adjacent-layer features of each layer to propose a three-phase hierarchical pruning framework, dubbed FCHP. First, we decompose each FM into several components to extract its discriminative features. Then, since pruning each layer is related to the FMs of adjacent layers, we explicitly compute the correlation between the discriminative features of two adjacent layers and use this correlation to cluster FMs into several hierarchies that guide subsequent pruning. Finally, we measure the discriminative-feature content of each FM and, within each hierarchy, remove the channels whose FMs carry fewer discriminative features. In experiments, we prune DNNs of multiple architecture types on different benchmarks, and the results achieve state-of-the-art parameter compression and FLOPs reduction. For example, for ResNet-56 on CIFAR-10, FCHP reduces both parameters and FLOPs by 50% with negligible accuracy loss; for ResNet-50 on ImageNet, FCHP reduces parameters by 40.5% and FLOPs by 44.1% with only a 0.43% drop in Top-1 accuracy.
AB - Pruning can remove redundant parameters and structures from Deep Neural Networks (DNNs) to reduce inference time and memory overhead. As one of the important components of DNNs, feature maps (FMs) have been widely used in network pruning. However, previous approaches neither fully investigate the discriminative features in FMs nor explicitly utilize all the features associated with each layer during pruning. In this paper, we explore the discriminative features of FMs and explicitly investigate the two-adjacent-layer features of each layer to propose a three-phase hierarchical pruning framework, dubbed FCHP. First, we decompose each FM into several components to extract its discriminative features. Then, since pruning each layer is related to the FMs of adjacent layers, we explicitly compute the correlation between the discriminative features of two adjacent layers and use this correlation to cluster FMs into several hierarchies that guide subsequent pruning. Finally, we measure the discriminative-feature content of each FM and, within each hierarchy, remove the channels whose FMs carry fewer discriminative features. In experiments, we prune DNNs of multiple architecture types on different benchmarks, and the results achieve state-of-the-art parameter compression and FLOPs reduction. For example, for ResNet-56 on CIFAR-10, FCHP reduces both parameters and FLOPs by 50% with negligible accuracy loss; for ResNet-50 on ImageNet, FCHP reduces parameters by 40.5% and FLOPs by 44.1% with only a 0.43% drop in Top-1 accuracy.
KW - Deep neural network
KW - discriminative feature
KW - feature correlation
KW - feature map
KW - network pruning
UR - https://www.scopus.com/pages/publications/85129347030
U2 - 10.1109/TCSVT.2022.3170620
DO - 10.1109/TCSVT.2022.3170620
M3 - Article
AN - SCOPUS:85129347030
SN - 1051-8215
VL - 32
SP - 6807
EP - 6820
JO - IEEE Transactions on Circuits and Systems for Video Technology
JF - IEEE Transactions on Circuits and Systems for Video Technology
IS - 10
ER -