I needed to write some TensorFlow code and had forgotten how the variable-sharing mechanism works, so I reviewed it and wrote it up here.
1. The sharing mechanism
TensorFlow shares variables mainly through its Variable Scope mechanism:
> # Create or return a variable with the given name
> tf.get_variable(<name>, <shape>, <initializer>)
>
> # Manage the namespace for variable names passed to get_variable()
> tf.variable_scope(<scope_name>)
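To make the semantics concrete, here is a minimal pure-Python sketch of the mechanism (a toy model for illustration only, not TensorFlow's actual implementation; the class and attribute names are mine): `get_variable` creates a variable the first time a scoped name is seen, and returns the existing one when reuse is switched on.

```python
class VariableStore:
    """Toy model of TF's variable store: one dict keyed by 'scope/name'."""
    def __init__(self):
        self.vars = {}       # full name -> value
        self.scope = []      # stack of enclosing scope names
        self.reuse = False   # create mode vs. reuse mode

    def get_variable(self, name, initial_value=0.0):
        full_name = "/".join(self.scope + [name])
        if self.reuse:
            # Reuse mode: the variable must already exist.
            if full_name not in self.vars:
                raise ValueError("Variable %s does not exist" % full_name)
            return full_name
        # Create mode: the variable must not already exist.
        if full_name in self.vars:
            raise ValueError("Variable %s already exists" % full_name)
        self.vars[full_name] = initial_value
        return full_name

store = VariableStore()
store.scope.append("conv1")
w = store.get_variable("weights")   # creates "conv1/weights"
store.reuse = True
w2 = store.get_variable("weights")  # returns the existing variable
```

Note that both calls resolve to the same entry in the store, which is exactly what sharing means here: one set of parameters, referenced from multiple call sites.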
2. Example
Step 1: create the variables
```python
def conv_relu(input, kernel_shape, bias_shape):
    # Create variable named "weights".
    weights = tf.get_variable("weights", kernel_shape,
                              initializer=tf.random_normal_initializer())
    # Create variable named "biases".
    biases = tf.get_variable("biases", bias_shape,
                             initializer=tf.constant_initializer(0.0))
    conv = tf.nn.conv2d(input, weights,
                        strides=[1, 1, 1, 1], padding='SAME')
    return tf.nn.relu(conv + biases)
```
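A side note on the convolution itself: with `padding='SAME'`, the output spatial size is `ceil(input_size / stride)`, so the `strides=[1, 1, 1, 1]` used above leaves the spatial dimensions unchanged. A quick sketch of that formula (plain Python; the function name is mine):

```python
import math

def same_padding_output_size(input_size, stride):
    # 'SAME' padding pads the input so that output = ceil(input / stride).
    return math.ceil(input_size / stride)

h_stride1 = same_padding_output_size(28, 1)  # stride 1 keeps the size: 28
h_stride2 = same_padding_output_size(28, 2)  # stride 2 halves it: 14
```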
Step 2: define the scopes. This part is a reusable module; it could be a convolution block or any other operation.
```python
def my_image_filter(input_images):
    with tf.variable_scope("conv1"):
        # Variables created here will be named "conv1/weights", "conv1/biases".
        relu1 = conv_relu(input_images, [5, 5, 32, 32], [32])
    with tf.variable_scope("conv2"):
        # Variables created here will be named "conv2/weights", "conv2/biases".
        return conv_relu(relu1, [5, 5, 32, 32], [32])
```
Step 3: reuse
Be careful here: when reusing a convolution block, remember to call scope.reuse_variables(); without it, the second call would try to create variables that already exist and raise an error.
```python
with tf.variable_scope("image_filters") as scope:
    result1 = my_image_filter(image1)
    scope.reuse_variables()  # the key line
    result2 = my_image_filter(image2)  # image2 now shares the same variables as image1
```
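As a sanity check on what sharing buys you: the first call to my_image_filter creates four variables (conv1/weights, conv1/biases, conv2/weights, conv2/biases, all under image_filters); after reuse_variables() the second call creates none. A toy count that just mimics the naming scheme from the comments above (plain Python, not TensorFlow itself; the function name is mine):

```python
def filter_variable_names(scope):
    # my_image_filter under the given scope touches these four variable names.
    names = []
    for layer in ("conv1", "conv2"):
        for var in ("weights", "biases"):
            names.append("%s/%s/%s" % (scope, layer, var))
    return names

created = set(filter_variable_names("image_filters"))  # first call: 4 new variables
second = set(filter_variable_names("image_filters"))   # second call resolves the same names
new_vars = second - created                            # with reuse: nothing new is created
```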