Freeze all convolutional layers before the last convolutional block:
# set the first 15 layers (up to the last conv block)
# to non-trainable (weights will not be updated)
for layer in model.layers[:15]:
    layer.trainable = False

# compile the model with an SGD/momentum optimizer
# and a very slow learning rate
model.compile(loss='binary_crossentropy',
              optimizer=optimizers.SGD(lr=1e-4, momentum=0.9),
              metrics=['accuracy'])
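Since `model.layers` is an ordinary Python list, the freeze pattern itself can be illustrated without building a real network. The `Layer` class below is a hypothetical stand-in for a Keras layer; only its `trainable` flag matters here:

```python
class Layer:
    """Minimal stand-in for a Keras layer (hypothetical, for illustration)."""
    def __init__(self, name):
        self.name = name
        self.trainable = True

# a toy "model" with 20 layers, mirroring model.layers
layers = [Layer(f"layer_{i}") for i in range(20)]

# freeze the first 15 layers, exactly as in the snippet above
for layer in layers[:15]:
    layer.trainable = False

frozen = sum(not l.trainable for l in layers)
trainable = sum(l.trainable for l in layers)
print(frozen, trainable)  # 15 frozen, 5 still trainable
```

In real Keras code the effect is the same: the first 15 entries of `model.layers` stop contributing trainable weights, which is why the model must be recompiled afterwards.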
To verify which layers are frozen, you can print each layer's `trainable` flag:
# Besides the FC layers, the conv layers closest to them can also take part
# in training. A model summary usually shows how many layers each conv block
# contains; here we set FREEZE_LAYERS to 17, so the last conv block and the
# FC layers are trained.
for layer in model.layers[:FREEZE_LAYERS]:
    layer.trainable = False
for layer in model.layers[FREEZE_LAYERS:]:
    layer.trainable = True
for layer in model.layers:
    print("layer.trainable:", layer.trainable)
If you write for layer in model.layers[:-1]: instead, it iterates from the first layer up to, but not including, the last layer.
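Because `model.layers` is an ordinary list, `[:-1]` follows standard Python slicing. A quick check with made-up layer names:

```python
# model.layers behaves like any Python list; these names are made up
layer_names = ["conv1", "conv2", "conv3", "fc"]

subset = layer_names[:-1]  # every layer except the last one
print(subset)              # ['conv1', 'conv2', 'conv3']
```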
You can also freeze layers by matching keywords in their names:
for layer in model.layers:
    layerName = str(layer.name)
    if layerName.startswith("RNN_") or layerName.startswith("Final_"):
        layer.trainable = False
    if layerName.endswith("c1") or layerName.endswith("c2_1") or layerName.endswith("c2_2"):
        layer.trainable = False
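The name-matching loop can be traced with the same hypothetical stand-in layers; the layer names below are invented to follow the RNN_/Final_/c1 conventions above:

```python
class Layer:
    """Minimal stand-in for a Keras layer (hypothetical, for illustration)."""
    def __init__(self, name):
        self.name = name
        self.trainable = True

layers = [Layer(n) for n in
          ["RNN_1", "Final_dense", "block1_c1", "block2_c2_1", "head"]]

for layer in layers:
    layer_name = str(layer.name)
    if layer_name.startswith("RNN_") or layer_name.startswith("Final_"):
        layer.trainable = False
    if layer_name.endswith("c1") or layer_name.endswith("c2_1") or layer_name.endswith("c2_2"):
        layer.trainable = False

print([l.trainable for l in layers])  # only "head" matches no rule and stays trainable
```

Matching by name is more robust than slicing by index when the architecture changes, since inserting a layer does not silently shift which layers get frozen.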
After freezing layers, the model must be recompiled for the change to take effect:
from keras import backend as K

# custom loss: categorical cross-entropy with label smoothing
# (K.categorical_crossentropy takes arguments in the order (target, output))
def mycrossentropy(y_true, y_pred, e=0.1):
    return ((1 - e) * K.categorical_crossentropy(y_true, y_pred)
            + e * K.categorical_crossentropy(K.ones_like(y_pred) / num_classes, y_pred))

model.compile(loss=mycrossentropy,
              optimizer=keras.optimizers.Adadelta(),
              metrics=['accuracy'])
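To see what this loss actually computes, here is a NumPy sketch of the same label-smoothing formula; `num_classes = 3` and the example probabilities are made up:

```python
import numpy as np

def categorical_crossentropy(y_true, y_pred):
    # standard cross-entropy: -sum(target * log(prediction)) per sample
    return -np.sum(y_true * np.log(y_pred), axis=-1)

def smoothed_crossentropy(y_true, y_pred, e=0.1, num_classes=3):
    # blend the true-label loss with the loss against a uniform target
    uniform = np.ones_like(y_pred) / num_classes
    return ((1 - e) * categorical_crossentropy(y_true, y_pred)
            + e * categorical_crossentropy(uniform, y_pred))

y_true = np.array([[1.0, 0.0, 0.0]])
y_pred = np.array([[0.7, 0.2, 0.1]])

plain = categorical_crossentropy(y_true, y_pred)
smoothed = smoothed_crossentropy(y_true, y_pred)
print(plain[0], smoothed[0])
```

For a confident, correct prediction the smoothed loss is slightly larger than the plain cross-entropy, which is the point of label smoothing: it penalizes over-confident outputs.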
References:
https://blog.csdn.net/baimafujinji/article/details/80743814
https://www.cnblogs.com/hutao722/p/9546521.html
https://blog.csdn.net/sinat_24899403/article/details/87815582