Study Reference: Inception-ResNet

ResNet
Inception V4
Introduction to the Inception module
1. A ResNet residual structure built with batch norm can speed up network convergence.
2. Inception's multiple receptive fields extract abstract features thoroughly, and the deeper the layer (the more abstract the features), the better they work.
3. Combining 1 and 2 into one module roughly corresponds to the Stem plus Inception-ResNet-A combination of Inception-ResNet-v1.
4. Combined modules are joined by max pooling, so the channel count keeps increasing while the feature map keeps shrinking.
5. For the 18 feature slices at the middle of the network, convolving each slice separately, then extracting multi-receptive-field features, and finally combining the results works slightly better.
6. The Inception module does not fuse with MobileNetV2's depthwise convolutions; combining them makes the model diverge.
7. Plain convolution works but converges slowly; batch norm is needed to speed up convergence.
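Points 1 and 7 can be sketched in plain NumPy (a simplified inference-style batch norm; `batchnorm` here is an illustration, not the post's TF helper):

```python
import numpy as np

def batchnorm(x, eps=1e-5):
    # simplified batch norm: normalize over batch and spatial axes, per channel
    mu = x.mean(axis=(0, 1, 2), keepdims=True)
    var = x.var(axis=(0, 1, 2), keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

x = np.random.default_rng(0).standard_normal((4, 8, 8, 16))
fx = batchnorm(x)            # normalized branch output
y = np.maximum(x + fx, 0.0)  # residual add, then ReLU, as in point 1
print(y.shape)  # (4, 8, 8, 16)
```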
Inception's different receptive fields:

def features(input, name, scale):
    with tf.variable_scope(name):
        # branch 0: plain 1x1 conv
        branch_0 = conv_1x1(input, 16*scale, name='branch_0_0')
        # branch 1: 1x1 reduce, then 3x3
        branch_1_0 = conv_1x1(input, 12*scale, name='branch_1_0')
        branch_1 = conv2d(branch_1_0, 16*scale, 3, 1, name='branch_1')
        # branch 2: 1x1 reduce, then two 3x3 (5x5 effective receptive field)
        branch_2_0 = conv_1x1(input, 16*scale, name='branch_2_0')
        branch_2_1 = conv2d(branch_2_0, 24*scale, 3, 1, name='branch_2_1')
        branch_2 = conv2d(branch_2_1, 24*scale, 3, 1, name='branch_2')
        # branch 3: average pooling, then 1x1
        branch_3_0 = avg_pooling(input)
        branch_3 = conv_1x1(branch_3_0, 8*scale, name='branch_3')
        # concat along channels: (16+16+24+8)*scale channels in total
        net = tf.concat([branch_0, branch_1, branch_2, branch_3], 3)
        return net
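The module's output width follows directly from the concat: (16+16+24+8)*scale channels. A quick NumPy shape check (the arrays are placeholders standing in for the four branch outputs):

```python
import numpy as np

scale = 2
b0 = np.zeros((10, 64, 64, 16*scale))  # branch 0
b1 = np.zeros((10, 64, 64, 16*scale))  # branch 1
b2 = np.zeros((10, 64, 64, 24*scale))  # branch 2
b3 = np.zeros((10, 64, 64, 8*scale))   # branch 3
out = np.concatenate([b0, b1, b2, b3], axis=3)
print(out.shape)  # (10, 64, 64, 128): 64*scale channels
```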

ResNet residual block (two-layer structure):

def ResNet0(input, output_dim, is_train, name):
    with tf.variable_scope(name):
        # two 3x3 conv + batch norm layers; ReLU only after the first
        net = conv2d(input, output_dim, 3, 1, name='ResNet0_conv2d0')
        net = batch_norm(net, train=is_train, name='ResNet0_bn0')
        net = relu(net)
        net = conv2d(net, output_dim, 3, 1, name='ResNet0_conv2d1')
        net = batch_norm(net, train=is_train, name='ResNet0_bn1')

        # shortcut: project with a 1x1 conv when the channel widths differ
        in_dim = int(input.get_shape().as_list()[-1])
        if in_dim != output_dim:
            ins = conv_1x1(input, output_dim, name='ex_dim')
            net = ins + net
        else:
            net = input + net
        return relu(net)
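The shortcut's shape-matching logic can be sketched channel-wise in NumPy, treating a 1x1 convolution as a per-pixel matrix multiply over the channel axis (`w_proj` is a hypothetical projection weight, not from the post):

```python
import numpy as np

def residual_add(x, fx, w_proj=None):
    # 1x1 conv == matmul over the last (channel) axis; project only on mismatch
    if x.shape[-1] != fx.shape[-1]:
        x = x @ w_proj
    return np.maximum(x + fx, 0.0)  # final ReLU, as in ResNet0

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 8, 8, 64))    # input with 64 channels
fx = rng.standard_normal((2, 8, 8, 128))  # branch output with 128 channels
w = rng.standard_normal((64, 128))        # hypothetical 1x1 projection
y = residual_add(x, fx, w)
print(y.shape)  # (2, 8, 8, 128)
```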

ResNet residual block (three-layer bottleneck), still to be verified:

def ResNet1(input, output_dim, stride, is_train, name):
    with tf.variable_scope(name):
        depth = input.get_shape().as_list()[-1]
        # 1x1 reduce
        net = conv2d(input, output_dim, 1, 1, name='ResNet1_conv2d0')
        net = batch_norm(net, train=is_train, name='ResNet1_bn0')
        net = relu(net)

        # 3x3, possibly strided
        net = conv2d(net, output_dim, 3, stride, name='ResNet1_conv2d1')
        net = batch_norm(net, train=is_train, name='ResNet1_bn1')
        net = relu(net)

        # 1x1 expand back to the input depth, so the identity shortcut matches
        net = conv2d(net, depth, 1, 1, name='ResNet1_conv2d2')
        net = batch_norm(net, train=is_train, name='ResNet1_bn2')
        if stride == 1:  # note: no shortcut is added at all when stride != 1
            if depth != output_dim:
                ins = conv_1x1(input, depth, name='ex_dim')
                net = ins + net
            else:
                net = input + net
        return relu(net)
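One point worth verifying: when `stride != 1` the block above adds no shortcut at all. Standard ResNet practice (an assumption here, not stated in the post) is a strided 1x1 projection on the shortcut path, so the skip connection survives downsampling. A NumPy sketch of the shape bookkeeping:

```python
import numpy as np

def shortcut(x, stride, out_ch, rng):
    # strided 1x1 projection: subsample spatially, then matmul over channels
    if stride > 1 or x.shape[-1] != out_ch:
        w = rng.standard_normal((x.shape[-1], out_ch))  # hypothetical weight
        x = x[:, ::stride, ::stride, :] @ w
    return x

rng = np.random.default_rng(0)
x = rng.standard_normal((10, 64, 64, 128))
s = shortcut(x, 2, 256, rng)
print(s.shape)  # (10, 32, 32, 256): matches a stride-2 branch output
```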

Combined module:

conv2_0 = ResNet0(x_c, 128, is_train, name='conv2_0')         # residual block: faster convergence
net = features(conv2_0, name='branch_0', scale=2)             # multi-receptive-field feature extraction
net = batch_norm(net + conv2_0, train=is_train, name='bn_0')  # residual add across the Inception module
net = relu(net)
conv2_0 = maxpool2d(net, k=2, name='maxpool2d_0')             # halve spatial size: (10,16,16,128)
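Point 4 above (channels grow while the feature map shrinks as these combined modules stack) can be checked with simple shape bookkeeping; the input shape and per-module channel widths here are hypothetical:

```python
# each combined module: the ResNet block sets the channels, k=2 maxpool halves H and W
shape = (10, 32, 32, 3)       # hypothetical input batch
for ch in (128, 256, 512):    # hypothetical widths for three stacked modules
    shape = (shape[0], shape[1] // 2, shape[2] // 2, ch)
print(shape)  # (10, 4, 4, 512)
```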

