ResNet
Inception V4
Notes on the Inception module
1. A ResNet residual structure built with batch norm speeds up network convergence.
2. Inception's branches with different receptive fields extract abstract features well, and the deeper the layer (the more abstract the features), the better they work.
3. Combining 1 and 2 into one module is roughly equivalent to the Stem plus Inception-ResNet-A combination of Inception-ResNet-v1.
4. Between combined modules, max pooling is applied, so the channel count keeps growing while the feature maps keep shrinking.
5. For the 18 slices at the midpoint of the network, convolving each slice first, then doing multi-receptive-field feature extraction, and finally combining the results works slightly better.
6. Inception modules do not mix well with the depthwise convolutions of MobileNetV2; combining them makes the model diverge.
7. Plain convolution behaves normally but converges slowly, so batch norm is needed to accelerate convergence.
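Points 1 and 7 both lean on batch normalization. As a rough illustration of what a `batch_norm` helper computes per channel (the function name and 1-D setting here are illustrative, not the document's actual TensorFlow helper):

```python
def batch_norm_1d(xs, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize a batch of scalars to zero mean / unit variance, then scale and shift."""
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    return [gamma * (x - mean) / (var + eps) ** 0.5 + beta for x in xs]

out = batch_norm_1d([1.0, 2.0, 3.0, 4.0])
# the normalized batch has (approximately) zero mean and unit variance
```

Keeping every layer's activations in this normalized range is what lets gradients flow at a stable scale, which is why both the residual blocks and the combined module below wrap their convolutions in batch norm.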
Inception's branches with different receptive fields:
def features(input, name, scale):
    """Inception-style block: parallel branches with different receptive fields."""
    with tf.variable_scope(name):
        # 1x1 branch
        branch_0 = conv_1x1(input, 16 * scale, name='branch_0_0')
        # 1x1 -> 3x3 branch
        branch_1_0 = conv_1x1(input, 12 * scale, name='branch_1_0')
        branch_1 = conv2d(branch_1_0, 16 * scale, 3, 1, name='branch_1')
        # 1x1 -> 3x3 -> 3x3 branch (5x5 effective receptive field)
        branch_2_0 = conv_1x1(input, 16 * scale, name='branch_2_0')
        branch_2_1 = conv2d(branch_2_0, 24 * scale, 3, 1, name='branch_2_1')
        branch_2 = conv2d(branch_2_1, 24 * scale, 3, 1, name='branch_2')
        # average-pool -> 1x1 branch
        branch_3_0 = avg_pooling(input)
        branch_3 = conv_1x1(branch_3_0, 8 * scale, name='branch_3')
        # concatenate all branches along the channel axis
        net = tf.concat([branch_0, branch_1, branch_2, branch_3], 3)  # (10,64,64,256)
        return net
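The concat stacks the four branches along the channel axis, so the output channel count is (16 + 16 + 24 + 8) * scale = 64 * scale. A quick check (pure Python, names illustrative):

```python
def inception_out_channels(scale):
    # branch_0 + branch_1 + branch_2 + branch_3 channel counts
    return (16 + 16 + 24 + 8) * scale

print(inception_out_channels(2))  # 128
```

With scale=2 this yields 128 channels, matching the 128-channel output of the residual block that gets added back in the combined module at the end of these notes.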
ResNet residual block (two-layer structure):
def ResNet0(input, output_dim, is_train, name):
    """Two-layer residual block: conv-BN-ReLU-conv-BN plus shortcut."""
    with tf.variable_scope(name):
        net = conv2d(input, output_dim, 3, 1, name='ResNet0_conv2d0')
        net = batch_norm(net, train=is_train, name='ResNet0_bn0')
        net = relu(net)
        net = conv2d(net, output_dim, 3, 1, name='ResNet0_conv2d1')
        net = batch_norm(net, train=is_train, name='ResNet0_bn1')
        in_dim = int(input.get_shape().as_list()[-1])
        if in_dim != output_dim:
            # project the shortcut with a 1x1 conv when channel counts differ
            ins = conv_1x1(input, output_dim, name='ex_dim')
            net = ins + net
        else:
            net = input + net
        return relu(net)
ResNet residual block (three-layer bottleneck), to be verified:
def ResNet1(input, output_dim, stride, is_train, name):
    """Three-layer bottleneck: 1x1 reduce -> 3x3 -> 1x1 expand back to the input depth."""
    with tf.variable_scope(name):
        depth = input.get_shape().as_list()[-1]
        net = conv2d(input, output_dim, 1, 1, name='ResNet1_conv2d0')
        net = batch_norm(net, train=is_train, name='ResNet1_bn0')
        net = relu(net)
        net = conv2d(net, output_dim, 3, stride, name='ResNet1_conv2d1')
        net = batch_norm(net, train=is_train, name='ResNet1_bn1')
        net = relu(net)
        net = conv2d(net, depth, 1, 1, name='ResNet1_conv2d2')
        net = batch_norm(net, train=is_train, name='ResNet1_bn2')
        if stride == 1:
            # the last 1x1 conv restores `depth` channels, so the identity
            # shortcut always matches here; the 1x1 projection branch below
            # is therefore redundant (kept as in the original notes)
            if depth != output_dim:
                ins = conv_1x1(input, depth, name='ex_dim')
                net = ins + net
            else:
                net = input + net
        return relu(net)
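The identity shortcut is added only when stride == 1 because a strided convolution shrinks the spatial size, so the input and the residual no longer line up. Assuming the `conv2d` helper uses TensorFlow's 'SAME' padding, the output spatial size is ceil(size / stride):

```python
import math

def same_conv_out_size(size, stride):
    # spatial output size of a convolution with 'SAME' padding
    return math.ceil(size / stride)

print(same_conv_out_size(64, 1))  # 64: shapes match, identity add is valid
print(same_conv_out_size(64, 2))  # 32: shapes differ, so no shortcut is added
```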
Combined module:
conv2_0 = ResNet0(x_c, 128, is_train, name='conv2_0')  # residual block to accelerate convergence
net = features(conv2_0, name='branch_0', scale=2)      # multi-receptive-field feature extraction
net = batch_norm(net + conv2_0, train=is_train, name='bn_0')  # residual add, then batch norm
net = relu(net)
conv2_0 = maxpool2d(net, k=2, name='maxpool2d_0')  # (10,16,16,128)
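Per point 4 above, stacking these combined modules alternates feature extraction with max pooling, halving the spatial size at each stage while the channel count grows. A sketch of the spatial-size progression (k=2 pooling, sizes assumed divisible by 2):

```python
def maxpool_out(size, k=2):
    # spatial size after max pooling with stride k
    return size // k

size = 64
for stage in range(2):
    size = maxpool_out(size)
    print(size)  # halves each stage: 32, then 16
```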