import tensorflow as tf
import numpy as np

a = np.array(range(16), dtype=np.float32)  # tf.nn.lrn requires a float input
b = a.reshape([1, 2, 2, 4])
with tf.Session() as sess:
    print(sess.run(tf.nn.lrn(b)))
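The snippet above uses the TF 1.x Session API. In TF 2.x the same call runs eagerly, with no session; a minimal sketch (using `tf.nn.local_response_normalization`, which is the same op as `tf.nn.lrn`):

```python
import numpy as np
import tensorflow as tf

# Same input as above, already float32 as the op requires.
b = np.arange(16, dtype=np.float32).reshape(1, 2, 2, 4)

# Eager execution: the result is available directly, no Session needed.
out = tf.nn.local_response_normalization(b).numpy()
print(out)
```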
b has shape [1, 2, 2, 4]:
array([[[[ 0,  1,  2,  3],
         [ 4,  5,  6,  7]],
        [[ 8,  9, 10, 11],
         [12, 13, 14, 15]]]])
After LRN:
[[[[0. 0.2581989 0.5163978 0.7745967 ]
[0.35494262 0.44367826 0.53241396 0.6211496 ]]
[[0.4175966 0.46979618 0.5219958 0.5741953 ]
[0.44262663 0.47951218 0.5163977 0.5532833 ]]]]
Spot-checking two entries by hand (the leading 1 inside each square root is the bias term; with depth_radius=5 the window covers the entire depth of 4):
3/np.sqrt(1 + 0 + 1 + 4 + 9) = 0.7745966692414834
8/np.sqrt(1 + 64 + 81 + 100 + 121) = 0.41759660077750443
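The two hand computations above can be reproduced directly in NumPy:

```python
import numpy as np

# Element 3 normalizes over its depth slice [0, 1, 2, 3]:
# denominator = sqrt(bias + 0^2 + 1^2 + 2^2 + 3^2)
v1 = 3 / np.sqrt(1 + 0 + 1 + 4 + 9)

# Element 8 normalizes over its depth slice [8, 9, 10, 11]:
v2 = 8 / np.sqrt(1 + 64 + 81 + 100 + 121)

print(v1)  # 0.7745966692414834
print(v2)  # 0.41759660077750443
```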
Function signature:
tf.nn.lrn(input, depth_radius=5, bias=1, alpha=1, beta=0.5, name=None)
sqr_sum[a, b, c, d] =
sum(input[a, b, c, d - depth_radius : d + depth_radius + 1] ** 2)
output = input / (bias + alpha * sqr_sum) ** beta
depth_radius is the half-width of the normalization window along the depth axis
bias is the constant added inside the denominator
alpha is the scale factor applied to the squared sum
beta is the exponent applied to the denominator
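Putting the formula and the four parameters together, here is a small NumPy reference implementation of the computation described above (the function name `lrn_ref` is my own; the depth window is clipped at the array boundaries, which is why depth_radius=5 simply covers the whole depth of 4 in this example):

```python
import numpy as np

def lrn_ref(x, depth_radius=5, bias=1.0, alpha=1.0, beta=0.5):
    """NumPy sketch of the LRN formula:
    sqr_sum[a,b,c,d] = sum(x[a,b,c, d-r : d+r+1] ** 2)
    output = x / (bias + alpha * sqr_sum) ** beta
    """
    out = np.empty_like(x, dtype=np.float64)
    depth = x.shape[-1]
    for d in range(depth):
        # Clip the depth window [d-r, d+r] to the valid range.
        lo = max(0, d - depth_radius)
        hi = min(depth, d + depth_radius + 1)
        sqr_sum = np.sum(x[..., lo:hi] ** 2, axis=-1)
        out[..., d] = x[..., d] / (bias + alpha * sqr_sum) ** beta
    return out

b = np.arange(16, dtype=np.float64).reshape(1, 2, 2, 4)
ref = lrn_ref(b)
print(ref)  # matches the tf.nn.lrn output shown above
```

Raising alpha or beta shrinks the outputs (stronger normalization), while a larger bias damps the effect of the squared sum.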