Prepending a downsampling layer to the pretrained ResNet50 model (loading ResNet50 weights)

I am using Keras 1.1.1 on Windows 7 with the TensorFlow backend.

I am trying to prepend an image downsampler to the stock ResNet50 pretrained model. Below is my code.

from keras.applications.resnet50 import ResNet50
import keras.layers

# this could also be the output of a different Keras model or layer
input = keras.layers.Input(shape=(400, 400, 1))  # this assumes K.image_dim_ordering() == 'tf'

x1 = keras.layers.AveragePooling2D(pool_size=(2, 2))(input)
x2 = keras.layers.Flatten()(x1)
x3 = keras.layers.RepeatVector(3)(x2)
x4 = keras.layers.Reshape((200, 200, 3))(x3)
x5 = keras.layers.ZeroPadding2D(padding=(12, 12))(x4)
m = keras.models.Model(input, x5)

model = ResNet50(input_tensor=m.output, weights='imagenet', include_top=False)

but I get an error which I am unsure how to fix:

builtins.Exception: Graph disconnected: cannot obtain value for tensor
Output("input_2:0", shape=(?, 400, 400, 1), dtype=float32) at layer
"input_2". The following previous layers were accessed without issue: []
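The shapes themselves line up; the "Graph disconnected" failure is about how the two graphs are wired together, not about sizes. A dependency-free trace (illustrative helper names, not Keras APIs) confirms the downsampling pipeline really produces the 224x224x3 input that ResNet50 expects:

```python
# Trace the tensor shape through each layer of the prepended pipeline.
def trace_shapes(h, w):
    shapes = [("Input", (h, w, 1))]
    h, w = h // 2, w // 2                       # AveragePooling2D(pool_size=(2, 2))
    shapes.append(("AveragePooling2D", (h, w, 1)))
    n = h * w                                   # Flatten
    shapes.append(("Flatten", (n,)))
    shapes.append(("RepeatVector(3)", (3, n)))  # RepeatVector
    shapes.append(("Reshape", (h, w, 3)))       # Reshape((200, 200, 3))
    h, w = h + 24, w + 24                       # ZeroPadding2D((12, 12)) pads both sides
    shapes.append(("ZeroPadding2D", (h, w, 3)))
    return shapes

for name, shape in trace_shapes(400, 400):
    print(name, shape)
# final line: ZeroPadding2D (224, 224, 3)
```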

Solution

You can use either the Functional API or the Sequential approach to solve this; working examples of both are below. The key is to instantiate ResNet50 with its own `input_shape` and then call the pretrained model on your tensor like a layer, rather than passing `m.output` as `input_tensor`.

from keras.applications.resnet50 import ResNet50
from keras.models import Sequential, Model
from keras.layers import AveragePooling2D, Flatten, RepeatVector, Reshape, ZeroPadding2D, Input, Dense

pretrained = ResNet50(input_shape=(224, 224, 3), weights='imagenet', include_top=False)

# Sequential method
model_1 = Sequential()
model_1.add(AveragePooling2D(pool_size=(2, 2), input_shape=(400, 400, 1)))
model_1.add(Flatten())
model_1.add(RepeatVector(3))
model_1.add(Reshape((200, 200, 3)))
model_1.add(ZeroPadding2D(padding=(12, 12)))
model_1.add(pretrained)
model_1.add(Dense(1))

# Functional API method
input = Input(shape=(400, 400, 1))
x = AveragePooling2D(pool_size=(2, 2))(input)
x = Flatten()(x)
x = RepeatVector(3)(x)
x = Reshape((200, 200, 3))(x)
x = ZeroPadding2D(padding=(12, 12))(x)
x = pretrained(x)
preds = Dense(1)(x)
model_2 = Model(input, preds)

model_1.summary()
model_2.summary()

The summaries (these were generated with an Xception backbone, so read `xception` as `resnet50`):

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
average_pooling2d_1 (Average (None, 200, 200, 1)       0
_________________________________________________________________
flatten_1 (Flatten)          (None, 40000)             0
_________________________________________________________________
repeat_vector_1 (RepeatVecto (None, 3, 40000)          0
_________________________________________________________________
reshape_1 (Reshape)          (None, 200, 200, 3)       0
_________________________________________________________________
zero_padding2d_1 (ZeroPaddin (None, 224, 224, 3)       0
_________________________________________________________________
xception (Model)             (None, 7, 7, 2048)        20861480
_________________________________________________________________
dense_1 (Dense)              (None, 7, 7, 1)           2049
=================================================================
Total params: 20,863,529
Trainable params: 20,809,001
Non-trainable params: 54,528
_________________________________________________________________

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_2 (InputLayer)         (None, 400, 400, 1)       0
_________________________________________________________________
average_pooling2d_2 (Average (None, 200, 200, 1)       0
_________________________________________________________________
flatten_2 (Flatten)          (None, 40000)             0
_________________________________________________________________
repeat_vector_2 (RepeatVecto (None, 3, 40000)          0
_________________________________________________________________
reshape_2 (Reshape)          (None, 200, 200, 3)       0
_________________________________________________________________
zero_padding2d_2 (ZeroPaddin (None, 224, 224, 3)       0
_________________________________________________________________
xception (Model)             (None, 7, 7, 2048)        20861480
_________________________________________________________________
dense_2 (Dense)              (None, 7, 7, 1)           2049
=================================================================
Total params: 20,863,529
Trainable params: 20,809,001
Non-trainable params: 54,528
_________________________________________________________________
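The parameter counts in the summaries can be checked by hand. `Dense(1)` applied to a (7, 7, 2048) feature map acts on the last axis, so it contributes 2048 weights plus one bias; everything else belongs to the pretrained backbone:

```python
# Sanity check of the parameter counts reported by model.summary() above.
pretrained_params = 20_861_480   # backbone total from the summary
dense_params = 2048 * 1 + 1      # Dense(1) on 2048 channels: weights + bias
total = pretrained_params + dense_params
print(dense_params, total)  # 2049 20863529, matching "Total params: 20,863,529"
```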

Both approaches work fine. If you plan to freeze the pretrained model, let the pre/post layers learn first, and afterwards fine-tune the whole model, the approach I found to work goes like so:

from keras.models import load_model

# given the same resnet model as before...
model = load_model('modelname.h5')

# pull out the nested model
nested_model = model.layers[5]  # assuming the pretrained model is the 6th layer (index 5)

# loop over the nested model's layers to allow training
for l in nested_model.layers:
    l.trainable = True

# insert the trainable pretrained model back into the original
model.layers[5] = nested_model
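The bookkeeping in that loop can be illustrated without Keras. The stand-in `Layer`/`Model` classes below are not Keras APIs; they only mimic the `trainable` flag and the `layers` list, so the index-5 lookup and the unfreeze loop can be seen in isolation:

```python
# Dependency-free sketch of the freeze/unfreeze pattern (stand-in classes).
class Layer:
    def __init__(self, name, trainable=True):
        self.name, self.trainable = name, trainable

class Model(Layer):
    def __init__(self, name, layers):
        super().__init__(name)
        self.layers = layers

# Nested pretrained model, frozen for the first training phase.
nested = Model("resnet50", [Layer("conv1", trainable=False),
                            Layer("conv2", trainable=False)])

# Outer model: five preprocessing layers, the nested model at index 5,
# then the new head -- mirroring model_1 from the answer.
outer = Model("outer", [Layer("pool"), Layer("flatten"), Layer("repeat"),
                        Layer("reshape"), Layer("pad"), nested, Layer("dense")])

# Fine-tuning phase: walk the nested model and re-enable training.
nested_model = outer.layers[5]
for l in nested_model.layers:
    l.trainable = True

print(all(l.trainable for l in nested_model.layers))  # True
```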
