How to use ExponentialMovingAverage
import tensorflow as tf

# Note: the first argument is the decay rate (e.g. 0.99), not a learning rate;
# passing global_step as num_updates makes TF use a smaller decay early in training.
variable_averages = tf.train.ExponentialMovingAverage(decay, global_step)
variables_averages_op = variable_averages.apply(tf.trainable_variables())
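Under the hood, apply() keeps one shadow variable per tracked variable and updates it as shadow = decay * shadow + (1 - decay) * variable; when a num_updates tensor (here global_step) is supplied, TensorFlow uses min(decay, (1 + step) / (10 + step)) so the average is not dominated by the initial value early on. A minimal pure-Python sketch of that update rule (function names are illustrative, not TensorFlow API):

```python
def effective_decay(decay, step):
    # With num_updates supplied, TF caps the decay early in training.
    return min(decay, (1 + step) / (10 + step))

def ema_update(shadow, value, decay, step):
    d = effective_decay(decay, step)
    # The shadow moves a fraction (1 - d) toward the current value.
    return d * shadow + (1 - d) * value

shadow = 0.0
for step in range(4):
    shadow = ema_update(shadow, 1.0, decay=0.99, step=step)
print(shadow)  # approaches 1.0 quickly because the decay is capped early on
```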
Why the error occurs
tf.trainable_variables() returns every trainable variable in the graph, and ExponentialMovingAverage creates a new shadow variable for each of them. For example, the variable "layer1-conv1/weight" produces "layer1-conv1/weight/ExponentialMovingAverage:0". If "layer1-conv1/weight" was originally declared with tf.get_variable(), the shadow variable inherits its non-reusability, so creating it a second time raises "ExponentialMovingAverage already exists".
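That uniqueness check can be mimicked with a small name registry: like tf.get_variable() outside a reuse scope, it raises as soon as the same name is requested twice, which is exactly what a second apply() does with each shadow-variable name. A hypothetical sketch (VariableStore is illustrative, not TensorFlow's internals):

```python
class VariableStore:
    """Toy stand-in for tf.get_variable()'s name registry."""
    def __init__(self, reuse=False):
        self.vars = {}
        self.reuse = reuse  # True mimics the effect of tf.AUTO_REUSE

    def get_variable(self, name, initial=0.0):
        if name in self.vars:
            if self.reuse:
                return self.vars[name]  # reuse: hand back the existing variable
            raise ValueError(f"Variable {name} already exists")
        self.vars[name] = initial
        return initial

store = VariableStore(reuse=False)
store.get_variable("layer1-conv1/weight/ExponentialMovingAverage")  # first apply(): ok
try:
    store.get_variable("layer1-conv1/weight/ExponentialMovingAverage")  # second: fails
except ValueError as e:
    print(e)
```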
Solution
import tensorflow as tf

with tf.variable_scope(name_or_scope=tf.get_variable_scope(), reuse=tf.AUTO_REUSE):
    variable_averages = tf.train.ExponentialMovingAverage(decay, global_step)
    variables_averages_op = variable_averages.apply(tf.trainable_variables())
Note: do not substitute reuse=True for reuse=tf.AUTO_REUSE. With reuse=True, tf.get_variable() requires every variable to already exist, so the very first apply() fails instead; tf.AUTO_REUSE creates a variable when it is missing and reuses it when it already exists.
With name_or_scope=tf.get_variable_scope(), the variable "layer1-conv1/weight" produces the shadow variable "layer1-conv1/weight/ExponentialMovingAverage:0"; with name_or_scope='test', it produces "test/layer1-conv1/weight/ExponentialMovingAverage:0".
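The shadow-variable names follow a simple pattern: the enclosing scope (if any) is prepended, then "/ExponentialMovingAverage" is appended to the original variable's name (the trailing ":0" is just the tensor's output index that TensorFlow adds). A quick sketch of that naming rule (the helper function is illustrative):

```python
def shadow_name(var_name, scope=""):
    # Scope prefix first (an empty scope adds nothing), then the EMA suffix.
    prefix = f"{scope}/" if scope else ""
    return f"{prefix}{var_name}/ExponentialMovingAverage"

print(shadow_name("layer1-conv1/weight"))          # layer1-conv1/weight/ExponentialMovingAverage
print(shadow_name("layer1-conv1/weight", "test"))  # test/layer1-conv1/weight/ExponentialMovingAverage
```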