- Blog posts (7)
- [Original] 2022 (CVPR) InDistill: Transferring Knowledge From Pruned Intermediate Layers (posted 2023-01-20, 117 views)
- [Original] 2022 (CVPR) HRel: Filter Pruning based on High Relevance between Activation Maps and Class Labels (posted 2023-01-09, 105 views)
- [Original] 2022 (IEEE) Distilling a Powerful Student Model via Online Knowledge Distillation (posted 2022-12-28, 485 views)
- [Original] 2021 (ICKD) Exploring Inter-Channel Correlation for Diversity-preserved Knowledge Distillation (posted 2022-12-18, 544 views)
- [Original] 2022 (DKD) Decoupled Knowledge Distillation (posted 2022-12-17, 353 views)
- [Original] 2020 (CS-KD) Regularizing Class-wise Predictions via Self-knowledge Distillation (posted 2022-12-17, 259 views)
- [Original] 2015 (KD) Distilling the Knowledge in a Neural Network (posted 2022-12-17, 180 views)