Analysis of Keras's example file mnist_net2net.py

This example demonstrates how to take a shallow convolutional neural network and make it deeper or wider.

For instance, first build a simple network with the following structure:

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv1 (Conv2D)               (None, 28, 28, 64)        640
_________________________________________________________________
pool1 (MaxPooling2D)         (None, 14, 14, 64)        0
_________________________________________________________________
conv2 (Conv2D)               (None, 14, 14, 64)        36928
_________________________________________________________________
pool2 (MaxPooling2D)         (None, 7, 7, 64)          0
_________________________________________________________________
flatten (Flatten)            (None, 3136)              0
_________________________________________________________________
fc1 (Dense)                  (None, 64)                200768
_________________________________________________________________
fc2 (Dense)                  (None, 10)                650
=================================================================
Total params: 238,986
Trainable params: 238,986
Non-trainable params: 0
_________________________________________________________________

After training it, we find a way to widen it into this:

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv1 (Conv2D)               (None, 28, 28, 128)       1280
_________________________________________________________________
pool1 (MaxPooling2D)         (None, 14, 14, 128)       0
_________________________________________________________________
conv2 (Conv2D)               (None, 14, 14, 64)        73792
_________________________________________________________________
pool2 (MaxPooling2D)         (None, 7, 7, 64)          0
_________________________________________________________________
flatten (Flatten)            (None, 3136)              0
_________________________________________________________________
fc1 (Dense)                  (None, 128)               401536
_________________________________________________________________
fc2 (Dense)                  (None, 10)                1290
=================================================================
Total params: 477,898
Trainable params: 477,898
Non-trainable params: 0
_________________________________________________________________

Or deepen it into this:

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv1 (Conv2D)               (None, 28, 28, 64)        640
_________________________________________________________________
pool1 (MaxPooling2D)         (None, 14, 14, 64)        0
_________________________________________________________________
conv2 (Conv2D)               (None, 14, 14, 64)        36928
_________________________________________________________________
conv2-deeper (Conv2D)        (None, 14, 14, 64)        36928
_________________________________________________________________
pool2 (MaxPooling2D)         (None, 7, 7, 64)          0
_________________________________________________________________
flatten (Flatten)            (None, 3136)              0
_________________________________________________________________
fc1 (Dense)                  (None, 64)                200768
_________________________________________________________________
fc1-deeper (Dense)           (None, 64)                4160
_________________________________________________________________
fc2 (Dense)                  (None, 10)                650
=================================================================
Total params: 280,074
Trainable params: 280,074
Non-trainable params: 0
_________________________________________________________________

In other words, the example shows how to read, modify, and add neural network parameters.

First, reading the parameters. Getting the weights of a convolutional layer and of a fully connected layer is just these two lines:

    w_conv1, b_conv1 = teacher_model.get_layer('conv1').get_weights()
    w_fc1, b_fc1 = teacher_model.get_layer('fc1').get_weights()
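For reference, here is what `get_weights()` returns on a freshly built stand-in model (this is not the example's actual teacher model; the layer sizes simply mirror the summary above):

```python
from tensorflow import keras
from tensorflow.keras import layers

# A stand-in model whose layers mirror the first summary above.
model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(64, 3, padding='same', activation='relu', name='conv1'),
    layers.MaxPooling2D(2, name='pool1'),
    layers.Conv2D(64, 3, padding='same', activation='relu', name='conv2'),
    layers.MaxPooling2D(2, name='pool2'),
    layers.Flatten(name='flatten'),
    layers.Dense(64, activation='relu', name='fc1'),
    layers.Dense(10, activation='softmax', name='fc2'),
])

# get_weights() returns [kernel, bias] for both layer types.
w_conv1, b_conv1 = model.get_layer('conv1').get_weights()
w_fc1, b_fc1 = model.get_layer('fc1').get_weights()

print(w_conv1.shape)  # (3, 3, 1, 64): kernel_h, kernel_w, in_channels, filters
print(b_conv1.shape)  # (64,)
print(w_fc1.shape)    # (3136, 64): flattened input size x units
```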

For widening, the modified convolutional and fully connected weights are written back with these two lines:

    model.get_layer('conv1').set_weights([new_w_conv1, new_b_conv1])
    model.get_layer('fc1').set_weights([new_w_fc1, new_b_fc1])

As for what values to write, you have some freedom: either append some randomly initialized units alongside the original weights, or copy some of the original units and add a little noise.
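The "copy plus noise" option is the function-preserving Net2WiderNet idea: if the incoming weights of the new units are copies of existing units, and the next layer's weights going out of each replicated unit are divided by its copy count, the network's output is (up to the noise) unchanged. A sketch for a dense layer; the helper name `wider_dense` and all sizes here are illustrative, not taken from the example file:

```python
import numpy as np

def wider_dense(w, b, w_next, new_width, noise_std=1e-2):
    """Widen a dense layer from w.shape[1] to new_width units.

    w, b      -- the layer's kernel (in, out) and bias (out,)
    w_next    -- kernel of the next layer, shape (out, next_out)
    """
    old_width = w.shape[1]
    assert new_width > old_width
    # Pick which existing units to replicate.
    idx = np.random.randint(old_width, size=new_width - old_width)
    # Incoming weights of new units: copies of the chosen columns, plus a
    # little noise so the duplicates do not stay identical during training.
    new_w = np.concatenate([w, w[:, idx]], axis=1)
    new_w[:, old_width:] += np.random.normal(0, noise_std,
                                             new_w[:, old_width:].shape)
    new_b = np.concatenate([b, b[idx]])
    # Outgoing weights: divide by each unit's copy count (original + replicas)
    # so the function the network computes is approximately preserved.
    counts = np.bincount(idx, minlength=old_width) + 1
    new_w_next = w_next / counts[:, None]
    new_w_next = np.concatenate([new_w_next, new_w_next[idx, :]], axis=0)
    return new_w, new_b, new_w_next

w = np.random.randn(100, 64)
b = np.random.randn(64)
w_next = np.random.randn(64, 10)
nw, nb, nwn = wider_dense(w, b, w_next, 128)
print(nw.shape, nb.shape, nwn.shape)  # (100, 128) (128,) (128, 10)
```

The returned arrays are exactly what would go into `set_weights` on the wider student model's `fc1` and `fc2` layers.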


For deepening, build a new network, copy the existing layers' parameters over, and initialize the parameters of the newly added layers however you like.
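One function-preserving choice for initializing a new layer is an identity mapping: with ReLU activations, an identity weight matrix and zero bias make the inserted `fc1-deeper` layer pass its (non-negative) input through unchanged at first. A minimal sketch, with sizes matching the summary above; the standalone setup is illustrative:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

units = 64  # fc1 has 64 units, so fc1-deeper maps 64 -> 64

# Build the new layer standalone so we can set its weights directly.
new_layer = layers.Dense(units, activation='relu', name='fc1-deeper')
new_layer.build((None, units))

# Identity kernel + zero bias: relu(x @ I + 0) == x for x >= 0,
# and fc1's ReLU output is always >= 0.
new_layer.set_weights([np.eye(units, dtype='float32'),
                       np.zeros(units, dtype='float32')])
```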


The modified network is then trained again.
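Retraining is the usual compile/fit cycle; a minimal self-contained sketch with a tiny stand-in model and random data (the real example retrains on MNIST, and the model and hyperparameters here are illustrative):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Tiny stand-in for the widened/deepened student model.
model = keras.Sequential([
    keras.Input(shape=(8,)),
    layers.Dense(4, activation='relu', name='fc1'),
    layers.Dense(2, activation='softmax', name='fc2'),
])
model.compile(loss='categorical_crossentropy', optimizer='sgd',
              metrics=['accuracy'])

# Random stand-in data in place of MNIST.
x = np.random.rand(32, 8).astype('float32')
y = keras.utils.to_categorical(np.random.randint(2, size=32), 2)
history = model.fit(x, y, epochs=1, verbose=0)
```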
