Attention Code for Images (TensorFlow)

Attention has recently been used widely in image segmentation networks, and the gains are noticeable, so I'm riding the wave too. Below is a TensorFlow implementation of attention: one module is position (spatial) attention, the other is channel attention.

import tensorflow as tf

def PAM_module(inputs):
    """Position attention module: every spatial position attends to every other position."""
    inputs_shape = inputs.get_shape().as_list()
    batchsize, height, width, C = inputs_shape[0], inputs_shape[1], inputs_shape[2], inputs_shape[3]
    # Separate 1x1 convolutions for query and key (C//8 channels each) and value (C channels).
    # Query and key must have their own weights; reusing one filter for both ties them together.
    query_filter = tf.Variable(tf.truncated_normal([1, 1, C, C // 8], dtype=tf.float32, stddev=0.1), name='query_weights')
    key_filter = tf.Variable(tf.truncated_normal([1, 1, C, C // 8], dtype=tf.float32, stddev=0.1), name='key_weights')
    value_filter = tf.Variable(tf.truncated_normal([1, 1, C, C], dtype=tf.float32, stddev=0.1), name='value_weights')
    query_conv = tf.nn.conv2d(inputs, query_filter, strides=[1, 1, 1, 1], padding='VALID')
    key_conv = tf.nn.conv2d(inputs, key_filter, strides=[1, 1, 1, 1], padding='VALID')
    value_conv = tf.nn.conv2d(inputs, value_filter, strides=[1, 1, 1, 1], padding='VALID')

    # Flatten the spatial dims: (B, H*W, C//8) x (B, C//8, H*W) -> (B, H*W, H*W)
    proj_query = tf.reshape(query_conv, [batchsize, width * height, -1])
    proj_key = tf.transpose(tf.reshape(key_conv, [batchsize, width * height, -1]), perm=[0, 2, 1])
    energy = tf.matmul(proj_query, proj_key)

    attention = tf.nn.softmax(energy)  # softmax over the last axis: each position's weights sum to 1
    proj_value = tf.reshape(value_conv, [batchsize, width * height, -1])

    out = tf.matmul(attention, proj_value)
    out = tf.reshape(out, [batchsize, height, width, C])
    out = out + inputs  # residual connection
    return out
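
The position attention above boils down to softmax(Q·Kᵀ)·V over flattened spatial locations. Here is a minimal NumPy sketch of that math, leaving out the learned 1x1 convolutions and using small made-up tensor sizes, so the shapes are easy to follow:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# toy feature map: batch 1, 4x4 spatial grid, 8 channels (hypothetical sizes)
B, H, W, C = 1, 4, 4, 8
rng = np.random.default_rng(0)
feats = rng.standard_normal((B, H, W, C))

# flatten spatial dims: each of the H*W positions becomes a C-dim vector
q = feats.reshape(B, H * W, C)          # query (1x1 convs omitted in this sketch)
k = feats.reshape(B, H * W, C)          # key
v = feats.reshape(B, H * W, C)          # value

energy = q @ k.transpose(0, 2, 1)       # (B, HW, HW): position-to-position similarity
attention = softmax(energy, axis=-1)    # each position's weights sum to 1
out = attention @ v                     # (B, HW, C): aggregate positions by similarity
out = out.reshape(B, H, W, C) + feats   # residual connection, as in the TF code
print(out.shape)                        # (1, 4, 4, 8)
```

Each row of `attention` is a distribution over all spatial positions, which is why the energy matrix is H*W by H*W and grows quadratically with image size.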

def CAM_module(inputs):
    """Channel attention module: every channel attends to every other channel."""
    inputs_shape = inputs.get_shape().as_list()
    batchsize, height, width, C = inputs_shape[0], inputs_shape[1], inputs_shape[2], inputs_shape[3]

    # (B, C, H*W) x (B, H*W, C) -> (B, C, C) channel-to-channel similarity
    proj_query = tf.transpose(tf.reshape(inputs, [batchsize, width * height, -1]), perm=[0, 2, 1])
    proj_key = tf.reshape(inputs, [batchsize, width * height, -1])
    energy = tf.matmul(proj_query, proj_key)
    # Subtract the energy from its row-wise maximum (not an elementwise max with -1),
    # matching the DANet reference implementation.
    energy_new = tf.reduce_max(energy, axis=-1, keepdims=True) - energy

    attention = tf.nn.softmax(energy_new)
    proj_value = tf.transpose(tf.reshape(inputs, [batchsize, width * height, -1]), perm=[0, 2, 1])

    out = tf.transpose(tf.matmul(attention, proj_value), perm=[0, 2, 1])
    out = tf.reshape(out, [batchsize, height, width, C])
    out = out + inputs  # residual connection
    return out
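
The channel attention follows the same pattern but transposed: the C-by-C energy matrix measures similarity between channels, and the row-wise max is subtracted before the softmax, as in the DANet reference code. A NumPy sketch with the same toy sizes as above:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# toy feature map: batch 1, 4x4 spatial grid, 8 channels (hypothetical sizes)
B, H, W, C = 1, 4, 4, 8
rng = np.random.default_rng(1)
feats = rng.standard_normal((B, H, W, C))

x = feats.reshape(B, H * W, C)                # flatten spatial dims
energy = x.transpose(0, 2, 1) @ x             # (B, C, C): channel-to-channel similarity
# subtract from the row-wise max before the softmax (the DANet trick)
energy_new = energy.max(axis=-1, keepdims=True) - energy
attention = softmax(energy_new, axis=-1)      # (B, C, C), rows sum to 1

out = attention @ x.transpose(0, 2, 1)        # (B, C, HW): reweight channels
out = out.transpose(0, 2, 1).reshape(B, H, W, C) + feats  # residual connection
print(out.shape)                              # (1, 4, 4, 8)
```

Note the energy here is C-by-C rather than H*W-by-H*W, so channel attention stays cheap even for large feature maps.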