The receptive field is computed with the following recurrence:

    r_0 = 1, S_0 = 1
    r_n = r_{n-1} + (f_n - 1) * S_{n-1}
    S_{n-1} = stride_1 * stride_2 * ... * stride_{n-1}

where r_n is the receptive field after layer n, f_n is the kernel size of layer n, and S_{n-1} is the product of the strides of the first n-1 layers.
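As a quick sanity check (a minimal sketch, not part of the original implementation), applying the recurrence by hand to AlexNet's first two layers (conv1: 11x11 kernel, stride 4; pool1: 3x3 kernel, stride 2) gives the values reported later in this post:

```python
# Apply r_n = r_{n-1} + (f_n - 1) * S_{n-1} step by step to
# AlexNet's conv1 (kernel 11, stride 4) and pool1 (kernel 3, stride 2).
rf, strides = 1, 1
for f, s in [(11, 4), (3, 2)]:
    rf = rf + (f - 1) * strides  # grow RF by (kernel - 1) * cumulative stride
    strides *= s                 # update the cumulative stride product
print(rf, strides)  # 19 8
```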
Let us also work through the receptive-field calculation for Dilated/Atrous Convolution. A dilated (atrous, i.e. "with holes") convolution is easy to understand: its sole purpose is to enlarge the receptive field while keeping the parameter count unchanged. When dilation = 1 it behaves exactly like an ordinary convolution. When dilation > 1, the kernel is effectively enlarged, as shown in the figure below: only the red points actually contribute, and the positions between them are zero. For a dilated convolution, the effective kernel size therefore grows from k to (k - 1) * d + 1.
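A one-line helper makes the effective kernel sizes concrete (an illustrative snippet; `effective_kernel` is just a name chosen here, not from the original code):

```python
def effective_kernel(k, d):
    # a k x k kernel with dilation d spans (k - 1) * d + 1 input positions
    return (k - 1) * d + 1

print([effective_kernel(3, d) for d in (1, 2, 4)])  # [3, 5, 9]
```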
With dilation = 1 in the first layer, the receptive field is 3x3, the same as a plain convolution. Stacking a second layer with dilation = 2 on top gives a 7x7 receptive field, and adding a third layer with dilation = 4 brings it to 15x15.
The code implementation is as follows:
"""
striden = strides= stride1*stride2*...*striden-1
rn = rn-1 + (fn - 1)*strides
"""
net_struct = {
'alexnet': {'net': [[11, 4, 1, 0], [3, 2, 1, 0], [5, 1, 1, 2], [3, 2, 1, 0], [3, 1, 1, 1], [3, 1, 1, 1], [3, 1, 1, 1], [3, 2, 1, 0]],
'name': ['conv1', 'pool1', 'conv2', 'pool2', 'conv3', 'conv4', 'conv5', 'pool5']},
'dilated_conv': {'net': [[3, 1, 1, 0], [3, 1, 2, 0], [3, 1, 4, 0]],
'name': ['dilated1', 'dilated2', 'dilated4']}
}
def calc_respective_fields(net):
layers = net['net']
layers_num = len(layers)
result = []
rf = 1
strides = 1
for i in range(layers_num):
# 卷积核大小/stride/dilation/padding
f, s, d, p = layers[i]
# 扩大卷积
f = (f - 1) * d + 1
rf = rf + (f - 1) * strides
strides *= s
result.append([rf, strides])
return result
if __name__ == '__main__':
net = net_struct['alexnet']
print('alextnet')
result = calc_respective_fields(net)
for i in range(len(result)):
print('alexnet %s layer output respective field %s strides %s' % (net['name'][i], result[i][0], result[i][1]))
net = net_struct['dilated_conv']
print('dilated_conv')
result = calc_respective_fields(net)
for i in range(len(result)):
print('alexnet %s layer output respective field %s strides %s' % (net['name'][i], result[i][0], result[i][1]))
alextnet
alexnet conv1 layer output respective field 11 strides 4
alexnet pool1 layer output respective field 19 strides 8
alexnet conv2 layer output respective field 51 strides 8
alexnet pool2 layer output respective field 67 strides 16
alexnet conv3 layer output respective field 99 strides 16
alexnet conv4 layer output respective field 131 strides 16
alexnet conv5 layer output respective field 163 strides 16
alexnet pool5 layer output respective field 195 strides 32
dilated_conv
alexnet dilated1 layer output respective field 3 strides 1
alexnet dilated2 layer output respective field 7 strides 1
alexnet dilated4 layer output respective field 15 strides 1
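To see the "larger receptive field, same parameter count" point from another angle, here is a quick comparison (an illustrative sketch, not from the original post): plain 3x3, stride-1 convolutions grow the receptive field by only 2 per layer, so matching the 15x15 receptive field of the three dilated layers above takes seven plain layers.

```python
# Plain 3x3 stride-1 convs: each layer adds (3 - 1) * 1 = 2 to the RF.
rf, layers = 1, 0
while rf < 15:
    rf += 2
    layers += 1
print(layers, rf)  # 7 15
```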
Hope this helps~