For an introduction to the model, see: blog post
As can be seen, the deepest member of the ResNet family reaches 152 layers, but the basic structure always breaks down into four stages of convolutional blocks, with base feature widths of 64, 128, 256, and 512 respectively.
Within each block, as the figure above shows, there are two convolutional layers with 3x3 kernels; each convolution is followed by batch normalization, the activation is relu, and the second relu is applied after the residual addition.
The number of blocks per stage, however, varies with depth, and the variants fall into two groups: (18, 34), which use this basic block, and (50, 101, 152), which use the three-convolution bottleneck block.
It is therefore cleaner to build ResNet out of a reusable basic block.
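For reference, the standard per-stage block counts of each variant (values from the original ResNet paper) can be written down as a simple lookup:
# number of blocks in each of the four stages, per variant
resnet_blocks = {
    'ResNet18':  [2, 2, 2, 2],
    'ResNet34':  [3, 4, 6, 3],
    'ResNet50':  [3, 4, 6, 3],
    'ResNet101': [3, 4, 23, 3],
    'ResNet152': [3, 8, 36, 3],
}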
Transfer learning
Let's take a look at the off-the-shelf answer first; here it is ResNet152.
import tensorflow as tf
from tensorflow import keras
base_model = keras.applications.ResNet152(weights='imagenet')
base_model.summary()
To reuse the pretrained network as a feature extractor, truncate it at the output of the 3rd block of the 5th convolutional stage, as sketched below.
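A minimal sketch of this truncation, assuming the target layer is named 'conv5_block3_out' (the naming convention used by keras.applications' ResNet models; confirm against base_model.summary()):
# build a feature extractor that stops at the output of conv5_block3
feature_model = keras.Model(inputs=base_model.input,
                            outputs=base_model.get_layer('conv5_block3_out').output)
feature_model.summary()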
Building the model yourself
Two different ways of building the model are provided; for an overview of the various ways to build deep learning models with keras, see: blog post
ResNet50/101/152: the models.Model (functional API) approach
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers, models, Sequential
from tensorflow.keras.layers import Input, Conv2D, MaxPooling2D, Dense, Flatten, Dropout, BatchNormalization, Activation, GlobalAveragePooling2D
# Bottleneck convolution block for ResNet50/101/152, built with the functional API
def conv_block(inputs, filter_num, stride=1, name=None):
    # 1x1 convolution reduces the channel count and carries the stride for downsampling
    x = Conv2D(filter_num[0], (1,1), strides=stride, padding='same', name=name+'_conv1')(inputs)
    x = BatchNormalization(axis=3, name=name+'_bn1')(x)
    x = Activation('relu', name=name+'_relu1')(x)
    # 3x3 convolution does the main spatial filtering
    x = Conv2D(filter_num[1], (3,3), strides=1, padding='same', name=name+'_conv2')(x)
    x = BatchNormalization(axis=3, name=name+'_bn2')(x)
    x = Activation('relu', name=name+'_relu2')(x)
    # 1x1 convolution expands the channels back out
    x = Conv2D(filter_num[2], (1,1), strides=1, padding='same', name=name+'_conv3')(x)
    x = BatchNormalization(axis=3, name=name+'_bn3')(x)
    # residual connection: project the shortcut with a 1x1 convolution so shapes match, then add
    r = Conv2D(filter_num[2], (1,1), strides=stride, padding='same', name=name+'_residual')(inputs)
    x = layers.add([x, r])
    # the second relu is applied after the residual addition
    x = Activation('relu', name=name+'_relu3')(x)
    return x
def build_block(x, filter_num, blocks, stride=1, name=None):
    # the first block may downsample through its stride; the remaining blocks keep stride 1
    x = conv_block(x, filter_num, stride, name=name)
    for i in range(1, blocks):
        x = conv_block(x, filter_num, stride=1, name=name+'_block'+str(i))
    return x
# Build ResNet50/101/152; the per-stage block counts are the standard values for each depth
def ResNet(Netname, nb_classes):
    ResNet_Config = {'ResNet50': [3, 4, 6, 3],
                     'ResNet101': [3, 4, 23, 3],
                     'ResNet152': [3, 8, 36, 3]}
    layers_dims = ResNet_Config[Netname]
    # bottleneck channel settings for the four stages
    filter_block1 = [64, 64, 256]
    filter_block2 = [128, 128, 512]
    filter_block3 = [256, 256, 1024]
    filter_block4 = [512, 512, 2048]
    img_input = Input(shape=(224, 224, 3))
    # stem: 7x7 convolution followed by max pooling
    x = Conv2D(64, (7,7), strides=2, padding='same', name='stem_conv')(img_input)
    x = BatchNormalization(axis=3, name='stem_bn')(x)
    x = Activation('relu', name='stem_relu')(x)
    x = MaxPooling2D((3,3), strides=2, padding='same', name='stem_pool')(x)
    # four stages of bottleneck blocks; stages 3-5 downsample with stride 2
    x = build_block(x, filter_block1, layers_dims[0], name='conv2')
    x = build_block(x, filter_block2, layers_dims[1], stride=2, name='conv3')
    x = build_block(x, filter_block3, layers_dims[2], stride=2, name='conv4')
    x = build_block(x, filter_block4, layers_dims[3], stride=2, name='conv5')
    # classification head
    x = GlobalAveragePooling2D(name='avg_pool')(x)
    x = Dense(nb_classes, activation='softmax', name='predictions')(x)
    return models.Model(img_input, x, name=Netname)
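A quick sanity check of the builder ('ResNet50' and the 1000-class head below are just illustrative choices):
# example: build ResNet50 for 1000 classes and inspect the layer list
model = ResNet('ResNet50', nb_classes=1000)
model.summary()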