Case 1:
import tensorflow as tf
tf.set_random_seed(42)
sess = tf.InteractiveSession()
a = tf.constant([1, 2, 3, 4, 5])
tf.global_variables_initializer().run()  # no variables in this graph; kept for parity
a_shuf = tf.random_shuffle(a)
print(a.eval())
print(a_shuf.eval())
sess.close()
Re-running the script above repeatedly produces different results.
import tensorflow as tf
tf.set_random_seed(42)
sess = tf.InteractiveSession()
a = tf.constant([1, 2, 3, 4, 5])
tf.global_variables_initializer().run()  # no variables in this graph; kept for parity
a_shuf = tf.random_shuffle(a, seed=42)
print(a.eval())
print(a_shuf.eval())
sess.close()
This version produces the same result on every run.
Case 2:
import tensorflow as tf
# using global seed
tf.set_random_seed(1)
fc1_W = tf.Variable(tf.truncated_normal(shape=(2, 2), mean=0, stddev=1))
fc2_W = tf.Variable(tf.truncated_normal(shape=(2, 2), mean=0, stddev=1))
# initialize session
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print('\n variables with global seed ')
    print('round 2.0')
    print(fc1_W.eval(sess))
    print(fc2_W.eval(sess))

# new session
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print('round 2.1')
    print(fc1_W.eval(sess))
    print(fc2_W.eval(sess))
First run:
variables with global seed
round 2.0
[[-0.89710885 0.39287093]
[ 0.4009913 -1.9170585 ]]
[[ 1.2090957 -0.13654923]
[ 1.6384401 -0.18959242]]
round 2.1
[[-0.89710885 0.39287093]
[ 0.4009913 -1.9170585 ]]
[[ 1.2090957 -0.13654923]
[ 1.6384401 -0.18959242]]
Within a single run, the two sessions created one after another on the same graph produce identical results.
Second run:
variables with global seed
round 2.0
[[ 1.1742289 0.03763932]
[ 0.7202809 -0.52002007]]
[[ 1.5818138 0.81615436]
[ 1.2647419 -0.6432518 ]]
round 2.1
[[ 1.1742289 0.03763932]
[ 0.7202809 -0.52002007]]
[[ 1.5818138 0.81615436]
[ 1.2647419 -0.6432518 ]]
The first run and the second run produce different results.
Conclusion: tf.set_random_seed alone cannot eliminate randomness across program runs. What it does guarantee is that, within a single run, multiple sessions created on the same graph yield identical results: each op's operation-level seed is derived once at graph-construction time (and then reused by every session), and since the graph-level seed is fixed, no randomness remains between sessions of the same graph.
Randomness is determined by two seeds: "Operations that rely on a random seed actually derive it from two seeds: the graph-level and operation-level seeds" (from the TensorFlow documentation).
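The two-seed idea can be sketched in plain Python. This is a conceptual analogy only: the derived_seed combiner below is hypothetical and TensorFlow's actual derivation is different, but the principle is the same; both seeds feed one deterministic stream, so the same (graph seed, op seed) pair always yields the same shuffle.

```python
import random

def derived_seed(graph_seed, op_seed):
    # Hypothetical combiner for illustration: fold the graph-level and
    # operation-level seeds into a single stream seed.
    return (graph_seed * 1_000_003 + op_seed) & 0xFFFFFFFF

def shuffled(values, graph_seed, op_seed):
    # Seed a fresh generator from the combined seed, then shuffle a copy,
    # mimicking how a seeded random_shuffle op restarts per "run".
    rng = random.Random(derived_seed(graph_seed, op_seed))
    out = list(values)
    rng.shuffle(out)
    return out

a = [1, 2, 3, 4, 5]
# Same (graph seed, op seed) pair -> identical shuffle on every call.
assert shuffled(a, 42, 42) == shuffled(a, 42, 42)
```

With only a graph-level seed, the op-level seed depends on graph construction, which is why reconstructing the graph in a new process (a new "run") can change the outcome.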