5.1 ResNet50
ResNet paper: arXiv:1512.03385v1 [cs.CV], 10 Dec 2015 (paper download link).
The design of ResNet has the following characteristics (a short sketch of the residual idea follows this list):
- A convolution path plus a shortcut path together form a Residual Block;
- In ResNet, downsampling is performed by convolutions with stride=2;
- The final features are obtained with average pooling rather than a stack of fully connected layers;
- Every convolutional layer is immediately followed by a BatchNorm layer.
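To make the first point concrete, here is a minimal sketch of a residual connection in Keras; the layer sizes are illustrative assumptions, not values from the paper. The block's output is the element-wise sum of the convolutional path and the untouched input, followed by ReLU.
# Minimal residual-connection sketch (illustrative shapes)
from keras.layers import Input, Conv2D, BatchNormalization, ReLU, Add
x_in = Input((56, 56, 64))
y = Conv2D(64, 3, padding='same')(x_in)   # convolutional path
y = BatchNormalization()(y)
y = Add()([y, x_in])                      # shortcut: add the unchanged input
y = ReLU()(y)                             # activation after the addition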
(1) Network structure
1. Conv + BatchNorm + ReLU
2. ResNet blocks
3. AvgPool + Fully Connected layer
(2) Identity blocks
input
↓
Conv 1x1, filters, s=1
BN
ReLU
↓
Conv 3x3, filters, s=1
BN
ReLU
↓
Conv 1x1, filters*4, s=1
BN
↓
Add (+ input)
ReLU
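The final Add works only because the bottleneck restores the channel count: the main path goes filters → filters → filters*4, and the block is applied where the input already has filters*4 channels, so both tensors have identical shapes. A small, hypothetical shape check (the sizes are assumptions for illustration):
from keras.layers import Input, Conv2D, Add
x_in = Input((56, 56, 256))                          # 256 = filters*4 for filters=64
y = Conv2D(64, 1, strides=1, padding='same')(x_in)   # (56, 56, 64)  1x1 reduce
y = Conv2D(64, 3, strides=1, padding='same')(y)      # (56, 56, 64)  3x3 transform
y = Conv2D(256, 1, strides=1, padding='same')(y)     # (56, 56, 256) 1x1 restore
out = Add()([y, x_in])                               # shapes match, so the addition is valid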
(3) Projection blocks
input
↓
Conv 1x1, filters, s=1
BN
ReLU
↓
Conv 3x3, filters, s=2
BN
ReLU
↓
Conv 1x1, filters*4, s=1
BN
↓
Add (+ shortcut: Conv 1x1, filters*4, s=2 → BN, applied to input)
ReLU
(In the first stage the projection block uses s=1 throughout, since MaxPool has already reduced the resolution.)
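A projection block is used at the start of each stage, where the spatial size and/or the channel count changes; the raw input can then no longer be added to the main-path output, so the shortcut itself passes through a strided 1x1 convolution plus BN. A brief, hypothetical illustration of the mismatch it resolves (sizes are assumptions for illustration):
from keras.layers import Input, Conv2D, Add
x_in = Input((56, 56, 256))
main = Conv2D(512, 1, strides=2, padding='same')(x_in)      # stand-in for the three-conv main path: (28, 28, 512)
shortcut = Conv2D(512, 1, strides=2, padding='same')(x_in)  # projection shortcut: (28, 28, 512)
out = Add()([main, shortcut])                               # x_in itself is (56, 56, 256) and could not be added directly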
(4) Model diagram
Conv(f=64, k=7, s=2, p=same)
BatchNormalization
ReLU
MaxPool2D(pool=3, s=2, p=same)
↓
resnet_block(x, f=64,  r=3, s=1)
resnet_block(x, f=128, r=4, s=2)
resnet_block(x, f=256, r=6, s=2)
resnet_block(x, f=512, r=3, s=2)
↓
GlobalAvgPool2D()(x)
Dense(1000)
softmax
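As a check on the name: counting the weighted layers gives 1 stem convolution + (3 + 4 + 6 + 3) bottleneck blocks × 3 convolutions each + 1 fully connected layer = 1 + 48 + 1 = 50, hence ResNet50.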
5.2 Code
(1) Code workflow
- Import the required packages
- Write the Conv + BatchNorm + ReLU block
- Build the identity blocks
- Build the projection blocks
- Build the ResNet blocks
- Build the model
(2) Code
# 1 Import
from keras import Model
from keras.utils import plot_model
from keras.layers import Input,Conv2D,BatchNormalization,ReLU,Add,MaxPool2D,GlobalAvgPool2D,Dense
# 2 Conv-BatchNorm-ReLU blocks
def conv_batchnorm_relu(x, filters, kernel_size, strides):
    # Every convolution is followed by BatchNorm and ReLU
    x = Conv2D(filters=filters, kernel_size=kernel_size, strides=strides, padding='same')(x)
    x = BatchNormalization()(x)
    x = ReLU()(x)
    return x
# 3 Identity blocks
def identity_block(input, filters):
    # Main path: 1x1 reduce -> 3x3 -> 1x1 expand to filters*4
    x = conv_batchnorm_relu(input, filters, kernel_size=1, strides=1)
    x = conv_batchnorm_relu(x, filters, kernel_size=3, strides=1)
    x = Conv2D(filters=filters*4, kernel_size=1, strides=1)(x)
    x = BatchNormalization()(x)
    # Shortcut: the unchanged input, added before the final ReLU
    x = Add()([x, input])
    x = ReLU()(x)
    return x
# 4 Projection blocks
def projection_block(input, filters, strides):
    # Main path: 1x1 -> 3x3 (strided) -> 1x1 expand to filters*4
    x = conv_batchnorm_relu(input, filters=filters, kernel_size=1, strides=1)
    x = conv_batchnorm_relu(x, filters=filters, kernel_size=3, strides=strides)
    x = Conv2D(filters=filters*4, kernel_size=1, strides=1)(x)
    x = BatchNormalization()(x)
    # Shortcut: strided 1x1 convolution + BN so its shape matches the main path
    shortcut = Conv2D(filters=filters*4, kernel_size=1, strides=strides)(input)
    shortcut = BatchNormalization()(shortcut)
    x = Add()([x, shortcut])
    x = ReLU()(x)
    return x
# 5 ResNet blocks
def resnet_block(x, filters, reps, strides):
    # Each stage: one projection block, then reps-1 identity blocks
    x = projection_block(x, filters=filters, strides=strides)
    for _ in range(reps - 1):
        x = identity_block(x, filters=filters)
    return x
# 6 Model
input = Input((224,224,3))
x = conv_batchnorm_relu(input,filters=64,kernel_size=7,strides=2)  # (None, 224, 224, 3) --> (None, 112, 112, 64)
x = MaxPool2D(pool_size=3,strides=2,padding='same')(x) # (None, 112, 112, 64)-->(None, 56, 56, 64)
x = resnet_block(x,filters=64,reps=3,strides=1) # (None, 56, 56, 64) --> (None, 56, 56, 256)
x = resnet_block(x,filters=128,reps=4,strides=2) # (None, 56, 56, 256) --> (None,28, 28, 512)
x = resnet_block(x,filters=256,reps=6,strides=2) # (None, 28, 28, 512) --> (None, 14, 14, 1024)
x = resnet_block(x,filters=512,reps=3,strides=2) # (None, 14, 14, 1024) --> (None, 7, 7, 2048)
x = GlobalAvgPool2D()(x) # (None, 2048)
output = Dense(units=1000,activation='softmax')(x) # (None, 1000)
model = Model(inputs=input,outputs=output)
model.summary()
'''
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) (None, 224, 224, 3) 0
__________________________________________________________________________________________________
conv2d_1 (Conv2D) (None, 112, 112, 64) 9472 input_1[0][0]
__________________________________________________________________________________________________
batch_normalization_1 (BatchNor (None, 112, 112, 64) 256 conv2d_1[0][0]
__________________________________________________________________________________________________
re_lu_1 (ReLU) (None, 112, 112, 64) 0 batch_normalization_1[0][0]
__________________________________________________________________________________________________
max_pooling2d_1 (MaxPooling2D) (None, 56, 56, 64) 0 re_lu_1[0][0]
__________________________________________________________________________________________________
conv2d_2 (Conv2D) (None, 56, 56, 64) 4160 max_pooling2d_1[0][0]
__________________________________________________________________________________________________
batch_normalization_2 (BatchNor (None, 56, 56, 64) 256 conv2d_2[0][0]
__________________________________________________________________________________________________
re_lu_2 (ReLU) (None, 56, 56, 64) 0 batch_normalization_2[0][0]
__________________________________________________________________________________________________
conv2d_3 (Conv2D) (None, 56, 56, 64) 36928 re_lu_2[0][0]
__________________________________________________________________________________________________
batch_normalization_3 (BatchNor (None, 56, 56, 64) 256 conv2d_3[0][0]
__________________________________________________________________________________________________
re_lu_3 (ReLU) (None, 56, 56, 64) 0 batch_normalization_3[0][0]
__________________________________________________________________________________________________
conv2d_4 (Conv2D) (None, 56, 56, 256) 16640 re_lu_3[0][0]
__________________________________________________________________________________________________
conv2d_5 (Conv2D) (None, 56, 56, 256) 16640 max_pooling2d_1[0][0]
__________________________________________________________________________________________________
batch_normalization_4 (BatchNor (None, 56, 56, 256) 1024 conv2d_4[0][0]
__________________________________________________________________________________________________
batch_normalization_5 (BatchNor (None, 56, 56, 256) 1024 conv2d_5[0][0]
__________________________________________________________________________________________________
add_1 (Add) (None, 56, 56, 256) 0 batch_normalization_4[0][0]
batch_normalization_5[0][0]
__________________________________________________________________________________________________
re_lu_4 (ReLU) (None, 56, 56, 256) 0 add_1[0][0]
__________________________________________________________________________________________________
conv2d_6 (Conv2D) (None, 56, 56, 64) 16448 re_lu_4[0][0]
__________________________________________________________________________________________________
batch_normalization_6 (BatchNor (None, 56, 56, 64) 256 conv2d_6[0][0]
__________________________________________________________________________________________________
re_lu_5 (ReLU) (None, 56, 56, 64) 0 batch_normalization_6[0][0]
__________________________________________________________________________________________________
conv2d_7 (Conv2D) (None, 56, 56, 64) 36928 re_lu_5[0][0]
__________________________________________________________________________________________________
batch_normalization_7 (BatchNor (None, 56, 56, 64) 256 conv2d_7[0][0]
__________________________________________________________________________________________________
re_lu_6 (ReLU) (None, 56, 56, 64) 0 batch_normalization_7[0][0]
__________________________________________________________________________________________________
conv2d_8 (Conv2D) (None, 56, 56, 256) 16640 re_lu_6[0][0]
__________________________________________________________________________________________________
batch_normalization_8 (BatchNor (None, 56, 56, 256) 1024 conv2d_8[0][0]
__________________________________________________________________________________________________
add_2 (Add) (None, 56, 56, 256) 0 batch_normalization_8[0][0]
re_lu_4[0][0]
__________________________________________________________________________________________________
re_lu_7 (ReLU) (None, 56, 56, 256) 0 add_2[0][0]
__________________________________________________________________________________________________
conv2d_9 (Conv2D) (None, 56, 56, 64) 16448 re_lu_7[0][0]
__________________________________________________________________________________________________
batch_normalization_9 (BatchNor (None, 56, 56, 64) 256 conv2d_9[0][0]
__________________________________________________________________________________________________
re_lu_8 (ReLU) (None, 56, 56, 64) 0 batch_normalization_9[0][0]
__________________________________________________________________________________________________
conv2d_10 (Conv2D) (None, 56, 56, 64) 36928 re_lu_8[0][0]
__________________________________________________________________________________________________
batch_normalization_10 (BatchNo (None, 56, 56, 64) 256 conv2d_10[0][0]
__________________________________________________________________________________________________
re_lu_9 (ReLU) (None, 56, 56, 64) 0 batch_normalization_10[0][0]
__________________________________________________________________________________________________
conv2d_11 (Conv2D) (None, 56, 56, 256) 16640 re_lu_9[0][0]
__________________________________________________________________________________________________
batch_normalization_11 (BatchNo (None, 56, 56, 256) 1024 conv2d_11[0][0]
__________________________________________________________________________________________________
add_3 (Add) (None, 56, 56, 256) 0 batch_normalization_11[0][0]
re_lu_7[0][0]
__________________________________________________________________________________________________
re_lu_10 (ReLU) (None, 56, 56, 256) 0 add_3[0][0]
__________________________________________________________________________________________________
conv2d_12 (Conv2D) (None, 56, 56, 128) 32896 re_lu_10[0][0]
__________________________________________________________________________________________________
batch_normalization_12 (BatchNo (None, 56, 56, 128) 512 conv2d_12[0][0]
__________________________________________________________________________________________________
re_lu_11 (ReLU) (None, 56, 56, 128) 0 batch_normalization_12[0][0]
__________________________________________________________________________________________________
conv2d_13 (Conv2D) (None, 28, 28, 128) 147584 re_lu_11[0][0]
__________________________________________________________________________________________________
batch_normalization_13 (BatchNo (None, 28, 28, 128) 512 conv2d_13[0][0]
__________________________________________________________________________________________________
re_lu_12 (ReLU) (None, 28, 28, 128) 0 batch_normalization_13[0][0]
__________________________________________________________________________________________________
conv2d_14 (Conv2D) (None, 28, 28, 512) 66048 re_lu_12[0][0]
__________________________________________________________________________________________________
conv2d_15 (Conv2D) (None, 28, 28, 512) 131584 re_lu_10[0][0]
__________________________________________________________________________________________________
batch_normalization_14 (BatchNo (None, 28, 28, 512) 2048 conv2d_14[0][0]
__________________________________________________________________________________________________
batch_normalization_15 (BatchNo (None, 28, 28, 512) 2048 conv2d_15[0][0]
__________________________________________________________________________________________________
add_4 (Add) (None, 28, 28, 512) 0 batch_normalization_14[0][0]
batch_normalization_15[0][0]
__________________________________________________________________________________________________
re_lu_13 (ReLU) (None, 28, 28, 512) 0 add_4[0][0]
__________________________________________________________________________________________________
conv2d_16 (Conv2D) (None, 28, 28, 128) 65664 re_lu_13[0][0]
__________________________________________________________________________________________________
batch_normalization_16 (BatchNo (None, 28, 28, 128) 512 conv2d_16[0][0]
__________________________________________________________________________________________________
re_lu_14 (ReLU) (None, 28, 28, 128) 0 batch_normalization_16[0][0]
__________________________________________________________________________________________________
conv2d_17 (Conv2D) (None, 28, 28, 128) 147584 re_lu_14[0][0]
__________________________________________________________________________________________________
batch_normalization_17 (BatchNo (None, 28, 28, 128) 512 conv2d_17[0][0]
__________________________________________________________________________________________________
re_lu_15 (ReLU) (None, 28, 28, 128) 0 batch_normalization_17[0][0]
__________________________________________________________________________________________________
conv2d_18 (Conv2D) (None, 28, 28, 512) 66048 re_lu_15[0][0]
__________________________________________________________________________________________________
batch_normalization_18 (BatchNo (None, 28, 28, 512) 2048 conv2d_18[0][0]
__________________________________________________________________________________________________
add_5 (Add) (None, 28, 28, 512) 0 batch_normalization_18[0][0]
re_lu_13[0][0]
__________________________________________________________________________________________________
re_lu_16 (ReLU) (None, 28, 28, 512) 0 add_5[0][0]
__________________________________________________________________________________________________
conv2d_19 (Conv2D) (None, 28, 28, 128) 65664 re_lu_16[0][0]
__________________________________________________________________________________________________
batch_normalization_19 (BatchNo (None, 28, 28, 128) 512 conv2d_19[0][0]
__________________________________________________________________________________________________
re_lu_17 (ReLU) (None, 28, 28, 128) 0 batch_normalization_19[0][0]
__________________________________________________________________________________________________
conv2d_20 (Conv2D) (None, 28, 28, 128) 147584 re_lu_17[0][0]
__________________________________________________________________________________________________
batch_normalization_20 (BatchNo (None, 28, 28, 128) 512 conv2d_20[0][0]
__________________________________________________________________________________________________
re_lu_18 (ReLU) (None, 28, 28, 128) 0 batch_normalization_20[0][0]
__________________________________________________________________________________________________
conv2d_21 (Conv2D) (None, 28, 28, 512) 66048 re_lu_18[0][0]
__________________________________________________________________________________________________
batch_normalization_21 (BatchNo (None, 28, 28, 512) 2048 conv2d_21[0][0]
__________________________________________________________________________________________________
add_6 (Add) (None, 28, 28, 512) 0 batch_normalization_21[0][0]
re_lu_16[0][0]
__________________________________________________________________________________________________
re_lu_19 (ReLU) (None, 28, 28, 512) 0 add_6[0][0]
__________________________________________________________________________________________________
conv2d_22 (Conv2D) (None, 28, 28, 128) 65664 re_lu_19[0][0]
__________________________________________________________________________________________________
batch_normalization_22 (BatchNo (None, 28, 28, 128) 512 conv2d_22[0][0]
__________________________________________________________________________________________________
re_lu_20 (ReLU) (None, 28, 28, 128) 0 batch_normalization_22[0][0]
__________________________________________________________________________________________________
conv2d_23 (Conv2D) (None, 28, 28, 128) 147584 re_lu_20[0][0]
__________________________________________________________________________________________________
batch_normalization_23 (BatchNo (None, 28, 28, 128) 512 conv2d_23[0][0]
__________________________________________________________________________________________________
re_lu_21 (ReLU) (None, 28, 28, 128) 0 batch_normalization_23[0][0]
__________________________________________________________________________________________________
conv2d_24 (Conv2D) (None, 28, 28, 512) 66048 re_lu_21[0][0]
__________________________________________________________________________________________________
batch_normalization_24 (BatchNo (None, 28, 28, 512) 2048 conv2d_24[0][0]
__________________________________________________________________________________________________
add_7 (Add) (None, 28, 28, 512) 0 batch_normalization_24[0][0]
re_lu_19[0][0]
__________________________________________________________________________________________________
re_lu_22 (ReLU) (None, 28, 28, 512) 0 add_7[0][0]
__________________________________________________________________________________________________
conv2d_25 (Conv2D) (None, 28, 28, 256) 131328 re_lu_22[0][0]
__________________________________________________________________________________________________
batch_normalization_25 (BatchNo (None, 28, 28, 256) 1024 conv2d_25[0][0]
__________________________________________________________________________________________________
re_lu_23 (ReLU) (None, 28, 28, 256) 0 batch_normalization_25[0][0]
__________________________________________________________________________________________________
conv2d_26 (Conv2D) (None, 14, 14, 256) 590080 re_lu_23[0][0]
__________________________________________________________________________________________________
batch_normalization_26 (BatchNo (None, 14, 14, 256) 1024 conv2d_26[0][0]
__________________________________________________________________________________________________
re_lu_24 (ReLU) (None, 14, 14, 256) 0 batch_normalization_26[0][0]
__________________________________________________________________________________________________
conv2d_27 (Conv2D) (None, 14, 14, 1024) 263168 re_lu_24[0][0]
__________________________________________________________________________________________________
conv2d_28 (Conv2D) (None, 14, 14, 1024) 525312 re_lu_22[0][0]
__________________________________________________________________________________________________
batch_normalization_27 (BatchNo (None, 14, 14, 1024) 4096 conv2d_27[0][0]
__________________________________________________________________________________________________
batch_normalization_28 (BatchNo (None, 14, 14, 1024) 4096 conv2d_28[0][0]
__________________________________________________________________________________________________
add_8 (Add) (None, 14, 14, 1024) 0 batch_normalization_27[0][0]
batch_normalization_28[0][0]
__________________________________________________________________________________________________
re_lu_25 (ReLU) (None, 14, 14, 1024) 0 add_8[0][0]
__________________________________________________________________________________________________
conv2d_29 (Conv2D) (None, 14, 14, 256) 262400 re_lu_25[0][0]
__________________________________________________________________________________________________
batch_normalization_29 (BatchNo (None, 14, 14, 256) 1024 conv2d_29[0][0]
__________________________________________________________________________________________________
re_lu_26 (ReLU) (None, 14, 14, 256) 0 batch_normalization_29[0][0]
__________________________________________________________________________________________________
conv2d_30 (Conv2D) (None, 14, 14, 256) 590080 re_lu_26[0][0]
__________________________________________________________________________________________________
batch_normalization_30 (BatchNo (None, 14, 14, 256) 1024 conv2d_30[0][0]
__________________________________________________________________________________________________
re_lu_27 (ReLU) (None, 14, 14, 256) 0 batch_normalization_30[0][0]
__________________________________________________________________________________________________
conv2d_31 (Conv2D) (None, 14, 14, 1024) 263168 re_lu_27[0][0]
__________________________________________________________________________________________________
batch_normalization_31 (BatchNo (None, 14, 14, 1024) 4096 conv2d_31[0][0]
__________________________________________________________________________________________________
add_9 (Add) (None, 14, 14, 1024) 0 batch_normalization_31[0][0]
re_lu_25[0][0]
__________________________________________________________________________________________________
re_lu_28 (ReLU) (None, 14, 14, 1024) 0 add_9[0][0]
__________________________________________________________________________________________________
conv2d_32 (Conv2D) (None, 14, 14, 256) 262400 re_lu_28[0][0]
__________________________________________________________________________________________________
batch_normalization_32 (BatchNo (None, 14, 14, 256) 1024 conv2d_32[0][0]
__________________________________________________________________________________________________
re_lu_29 (ReLU) (None, 14, 14, 256) 0 batch_normalization_32[0][0]
__________________________________________________________________________________________________
conv2d_33 (Conv2D) (None, 14, 14, 256) 590080 re_lu_29[0][0]
__________________________________________________________________________________________________
batch_normalization_33 (BatchNo (None, 14, 14, 256) 1024 conv2d_33[0][0]
__________________________________________________________________________________________________
re_lu_30 (ReLU) (None, 14, 14, 256) 0 batch_normalization_33[0][0]
__________________________________________________________________________________________________
conv2d_34 (Conv2D) (None, 14, 14, 1024) 263168 re_lu_30[0][0]
__________________________________________________________________________________________________
batch_normalization_34 (BatchNo (None, 14, 14, 1024) 4096 conv2d_34[0][0]
__________________________________________________________________________________________________
add_10 (Add) (None, 14, 14, 1024) 0 batch_normalization_34[0][0]
re_lu_28[0][0]
__________________________________________________________________________________________________
re_lu_31 (ReLU) (None, 14, 14, 1024) 0 add_10[0][0]
__________________________________________________________________________________________________
conv2d_35 (Conv2D) (None, 14, 14, 256) 262400 re_lu_31[0][0]
__________________________________________________________________________________________________
batch_normalization_35 (BatchNo (None, 14, 14, 256) 1024 conv2d_35[0][0]
__________________________________________________________________________________________________
re_lu_32 (ReLU) (None, 14, 14, 256) 0 batch_normalization_35[0][0]
__________________________________________________________________________________________________
conv2d_36 (Conv2D) (None, 14, 14, 256) 590080 re_lu_32[0][0]
__________________________________________________________________________________________________
batch_normalization_36 (BatchNo (None, 14, 14, 256) 1024 conv2d_36[0][0]
__________________________________________________________________________________________________
re_lu_33 (ReLU) (None, 14, 14, 256) 0 batch_normalization_36[0][0]
__________________________________________________________________________________________________
conv2d_37 (Conv2D) (None, 14, 14, 1024) 263168 re_lu_33[0][0]
__________________________________________________________________________________________________
batch_normalization_37 (BatchNo (None, 14, 14, 1024) 4096 conv2d_37[0][0]
__________________________________________________________________________________________________
add_11 (Add) (None, 14, 14, 1024) 0 batch_normalization_37[0][0]
re_lu_31[0][0]
__________________________________________________________________________________________________
re_lu_34 (ReLU) (None, 14, 14, 1024) 0 add_11[0][0]
__________________________________________________________________________________________________
conv2d_38 (Conv2D) (None, 14, 14, 256) 262400 re_lu_34[0][0]
__________________________________________________________________________________________________
batch_normalization_38 (BatchNo (None, 14, 14, 256) 1024 conv2d_38[0][0]
__________________________________________________________________________________________________
re_lu_35 (ReLU) (None, 14, 14, 256) 0 batch_normalization_38[0][0]
__________________________________________________________________________________________________
conv2d_39 (Conv2D) (None, 14, 14, 256) 590080 re_lu_35[0][0]
__________________________________________________________________________________________________
batch_normalization_39 (BatchNo (None, 14, 14, 256) 1024 conv2d_39[0][0]
__________________________________________________________________________________________________
re_lu_36 (ReLU) (None, 14, 14, 256) 0 batch_normalization_39[0][0]
__________________________________________________________________________________________________
conv2d_40 (Conv2D) (None, 14, 14, 1024) 263168 re_lu_36[0][0]
__________________________________________________________________________________________________
batch_normalization_40 (BatchNo (None, 14, 14, 1024) 4096 conv2d_40[0][0]
__________________________________________________________________________________________________
add_12 (Add) (None, 14, 14, 1024) 0 batch_normalization_40[0][0]
re_lu_34[0][0]
__________________________________________________________________________________________________
re_lu_37 (ReLU) (None, 14, 14, 1024) 0 add_12[0][0]
__________________________________________________________________________________________________
conv2d_41 (Conv2D) (None, 14, 14, 256) 262400 re_lu_37[0][0]
__________________________________________________________________________________________________
batch_normalization_41 (BatchNo (None, 14, 14, 256) 1024 conv2d_41[0][0]
__________________________________________________________________________________________________
re_lu_38 (ReLU) (None, 14, 14, 256) 0 batch_normalization_41[0][0]
__________________________________________________________________________________________________
conv2d_42 (Conv2D) (None, 14, 14, 256) 590080 re_lu_38[0][0]
__________________________________________________________________________________________________
batch_normalization_42 (BatchNo (None, 14, 14, 256) 1024 conv2d_42[0][0]
__________________________________________________________________________________________________
re_lu_39 (ReLU) (None, 14, 14, 256) 0 batch_normalization_42[0][0]
__________________________________________________________________________________________________
conv2d_43 (Conv2D) (None, 14, 14, 1024) 263168 re_lu_39[0][0]
__________________________________________________________________________________________________
batch_normalization_43 (BatchNo (None, 14, 14, 1024) 4096 conv2d_43[0][0]
__________________________________________________________________________________________________
add_13 (Add) (None, 14, 14, 1024) 0 batch_normalization_43[0][0]
re_lu_37[0][0]
__________________________________________________________________________________________________
re_lu_40 (ReLU) (None, 14, 14, 1024) 0 add_13[0][0]
__________________________________________________________________________________________________
conv2d_44 (Conv2D) (None, 14, 14, 512) 524800 re_lu_40[0][0]
__________________________________________________________________________________________________
batch_normalization_44 (BatchNo (None, 14, 14, 512) 2048 conv2d_44[0][0]
__________________________________________________________________________________________________
re_lu_41 (ReLU) (None, 14, 14, 512) 0 batch_normalization_44[0][0]
__________________________________________________________________________________________________
conv2d_45 (Conv2D) (None, 7, 7, 512) 2359808 re_lu_41[0][0]
__________________________________________________________________________________________________
batch_normalization_45 (BatchNo (None, 7, 7, 512) 2048 conv2d_45[0][0]
__________________________________________________________________________________________________
re_lu_42 (ReLU) (None, 7, 7, 512) 0 batch_normalization_45[0][0]
__________________________________________________________________________________________________
conv2d_46 (Conv2D) (None, 7, 7, 2048) 1050624 re_lu_42[0][0]
__________________________________________________________________________________________________
conv2d_47 (Conv2D) (None, 7, 7, 2048) 2099200 re_lu_40[0][0]
__________________________________________________________________________________________________
batch_normalization_46 (BatchNo (None, 7, 7, 2048) 8192 conv2d_46[0][0]
__________________________________________________________________________________________________
batch_normalization_47 (BatchNo (None, 7, 7, 2048) 8192 conv2d_47[0][0]
__________________________________________________________________________________________________
add_14 (Add) (None, 7, 7, 2048) 0 batch_normalization_46[0][0]
batch_normalization_47[0][0]
__________________________________________________________________________________________________
re_lu_43 (ReLU) (None, 7, 7, 2048) 0 add_14[0][0]
__________________________________________________________________________________________________
conv2d_48 (Conv2D) (None, 7, 7, 512) 1049088 re_lu_43[0][0]
__________________________________________________________________________________________________
batch_normalization_48 (BatchNo (None, 7, 7, 512) 2048 conv2d_48[0][0]
__________________________________________________________________________________________________
re_lu_44 (ReLU) (None, 7, 7, 512) 0 batch_normalization_48[0][0]
__________________________________________________________________________________________________
conv2d_49 (Conv2D) (None, 7, 7, 512) 2359808 re_lu_44[0][0]
__________________________________________________________________________________________________
batch_normalization_49 (BatchNo (None, 7, 7, 512) 2048 conv2d_49[0][0]
__________________________________________________________________________________________________
re_lu_45 (ReLU) (None, 7, 7, 512) 0 batch_normalization_49[0][0]
__________________________________________________________________________________________________
conv2d_50 (Conv2D) (None, 7, 7, 2048) 1050624 re_lu_45[0][0]
__________________________________________________________________________________________________
batch_normalization_50 (BatchNo (None, 7, 7, 2048) 8192 conv2d_50[0][0]
__________________________________________________________________________________________________
add_15 (Add) (None, 7, 7, 2048) 0 batch_normalization_50[0][0]
re_lu_43[0][0]
__________________________________________________________________________________________________
re_lu_46 (ReLU) (None, 7, 7, 2048) 0 add_15[0][0]
__________________________________________________________________________________________________
conv2d_51 (Conv2D) (None, 7, 7, 512) 1049088 re_lu_46[0][0]
__________________________________________________________________________________________________
batch_normalization_51 (BatchNo (None, 7, 7, 512) 2048 conv2d_51[0][0]
__________________________________________________________________________________________________
re_lu_47 (ReLU) (None, 7, 7, 512) 0 batch_normalization_51[0][0]
__________________________________________________________________________________________________
conv2d_52 (Conv2D) (None, 7, 7, 512) 2359808 re_lu_47[0][0]
__________________________________________________________________________________________________
batch_normalization_52 (BatchNo (None, 7, 7, 512) 2048 conv2d_52[0][0]
__________________________________________________________________________________________________
re_lu_48 (ReLU) (None, 7, 7, 512) 0 batch_normalization_52[0][0]
__________________________________________________________________________________________________
conv2d_53 (Conv2D) (None, 7, 7, 2048) 1050624 re_lu_48[0][0]
__________________________________________________________________________________________________
batch_normalization_53 (BatchNo (None, 7, 7, 2048) 8192 conv2d_53[0][0]
__________________________________________________________________________________________________
add_16 (Add) (None, 7, 7, 2048) 0 batch_normalization_53[0][0]
re_lu_46[0][0]
__________________________________________________________________________________________________
re_lu_49 (ReLU) (None, 7, 7, 2048) 0 add_16[0][0]
__________________________________________________________________________________________________
global_average_pooling2d_1 (Glo (None, 2048) 0 re_lu_49[0][0]
__________________________________________________________________________________________________
dense_1 (Dense) (None, 1000) 2049000 global_average_pooling2d_1[0][0]
==================================================================================================
Total params: 25,636,712
Trainable params: 25,583,592
Non-trainable params: 53,120
__________________________________________________________________________________________________
'''
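As a final sanity check (a hedged sketch: the optimizer, loss, and random batch below are illustrative assumptions, not part of the original code), the model built above can be compiled and run on dummy data:
import numpy as np
# Compile with placeholder settings and run a forward pass on a random batch
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
dummy = np.random.rand(2, 224, 224, 3).astype('float32')
preds = model.predict(dummy)
print(preds.shape)   # expected: (2, 1000)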