Paper Reading
Golden-sun
- Model Compression: Notes on Related Papers
  Covers: Learning both Weights and Connections for Efficient Neural Networks (NIPS 2015); Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding (ICLR 2016).
  Original post · 2020-10-12 21:18:37
- Knowledge Distillation: Notes on Related Papers
  1. A Gift from Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning (CVPR 2017). Main contributions: 1. Proposes a…
  Original post · 2020-10-12 21:18:13