Paper Reading: Slimmable Neural Networks


1 What is done

  1. Presents a way to run a single fixed model under different computational budgets.
  2. A method called switchable batch normalization is applied in order to train such a model successfully.
  3. Experiments demonstrating that this method works properly are presented.

2 How it works

2.1 Slimmable network

2.1.1 The idea

The general idea is rather simple. Since we want to apply one model under different computational budgets, why not simply drop some of the channels to reduce computation? This is exactly the idea behind network pruning; the difference is that this paper trains a single model that works at several pruning rates.
For example, when we drop 10% of its channels, it works well; when we drop 25%, 50% or even 75%, it still works with a reasonable accuracy loss.
Fig1
As shown in the figure above, the model works at [1.0x 0.75x 0.5x 0.25x] widths. Note that width here refers to the number of channels in the network layers.
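A minimal sketch of this idea, assuming PyTorch; the class name and the `width_mult` attribute are illustrative and not taken from the authors' released code. A convolution can run at several widths by keeping only a leading slice of its filters:

```python
import torch.nn as nn
import torch.nn.functional as F

class SlimmableConv2d(nn.Conv2d):
    """Conv2d whose active output channels shrink with a width multiplier (assumes groups == 1)."""

    def __init__(self, max_in, max_out, kernel_size, **kwargs):
        super().__init__(max_in, max_out, kernel_size, **kwargs)
        self.max_out = max_out
        self.width_mult = 1.0  # current switch, e.g. 1.0, 0.75, 0.5 or 0.25

    def forward(self, x):
        in_ch = x.size(1)                      # channels delivered by the previous (possibly slimmed) layer
        out_ch = int(self.max_out * self.width_mult)
        weight = self.weight[:out_ch, :in_ch]  # keep only the first out_ch filters
        bias = self.bias[:out_ch] if self.bias is not None else None
        return F.conv2d(x, weight, bias, self.stride,
                        self.padding, self.dilation, self.groups)
```

Before each forward pass, `width_mult` would be set to the same value on every slimmable layer, so the whole network runs at one consistent width.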

2.1.2 Realization

To realize this seemingly remarkable property, we have to train the model under the [1.0x 0.75x 0.5x 0.25x etc.] conditions. Each of these widths is therefore given a name: a switch. For example, 0.25x means that the width of every layer is scaled to 0.25 of the full model.
Then, it’s time to train the model. Here is the pseudo code:
Fig2
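A hedged sketch of the training loop described by the pseudo code above: for each mini-batch, the losses of all switches are back-propagated so their gradients accumulate, then a single optimizer step is taken. `model.set_width(w)` is an assumed helper that sets the width multiplier (and selects the matching batch normalization) on every layer.

```python
import torch

width_list = [0.25, 0.5, 0.75, 1.0]        # the switches
criterion = torch.nn.CrossEntropyLoss()

def train_one_epoch(model, loader, optimizer, device):
    model.train()
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        for w in width_list:               # run every switch on the same batch
            model.set_width(w)             # assumed helper: switch width + BN
            loss = criterion(model(images), labels)
            loss.backward()                # gradients accumulate across switches
        optimizer.step()                   # one update using the summed gradients
```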

2.2 Switchable batch normalization

This model uses independent batch normalization parameters for each switch. Why is that necessary? The following figure gives the answer:
Fig3
The left plot shows the training error rates with and without S-BN (switchable batch normalization), and they look almost the same. The validation error on the right, however, shows that training without S-BN can lead to unstable results.
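A minimal sketch of switchable batch normalization, again assuming PyTorch: one independent BatchNorm2d (with its own scale, shift and running statistics) per switch, selected at run time. The class and attribute names are illustrative, not the authors' code.

```python
import torch.nn as nn

class SwitchableBatchNorm2d(nn.Module):
    def __init__(self, max_channels, width_list=(0.25, 0.5, 0.75, 1.0)):
        super().__init__()
        self.width_list = list(width_list)
        # one BN per switch, sized to that switch's channel count
        self.bns = nn.ModuleList(
            nn.BatchNorm2d(int(max_channels * w)) for w in self.width_list
        )
        self.width_mult = 1.0

    def forward(self, x):
        idx = self.width_list.index(self.width_mult)
        return self.bns[idx](x)            # use the BN matching the active switch
```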

3 The performance – experiments and results

3.1 Performance on image classification task

Fig4
On MobileNet v2, ShuffleNet and ResNet-50, the slimmable neural network achieves comparable results. At comparable accuracy and FLOPs, a single slimmable network is equivalent to four models at the same time.

3.2 Performance on object detection, instance segmentation and keypoint detection

Fig5
The results here are almost the same as for the image classification task. For Faster-RCNN, Mask-RCNN and Keypoints-RCNN with ResNet-50 as the backbone on the COCO 2017 dataset, the slimmable neural network achieves comparable results.

3.3 Performance under different numbers of switches

Fig6
To analyze how the number of switches impacts accuracy, the authors trained an 8-switch neural network. The comparison on MobileNet v1 among the individually trained models, the 4-switch model and the 8-switch model shows that the slimmable neural network is insensitive to the number of switches.
