1. Basics
One gradient update counts as one step.
1.1 Creation
# This one only retrieves the global step, so it must already exist; otherwise it returns None.
tf.train.get_global_step(graph=None)
# This one creates the global step if it does not exist yet.
tf.train.get_or_create_global_step(graph=None)
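A minimal sketch of the difference between the two calls (run through `tf.compat.v1` so it also works on TensorFlow 2, an assumption; on TensorFlow 1.x a plain `import tensorflow as tf` works as-is):

```python
import tensorflow.compat.v1 as tf  # on TF 1.x: import tensorflow as tf
tf.disable_eager_execution()

# Before anything is created, get_global_step finds nothing and returns None.
assert tf.train.get_global_step() is None

# get_or_create_global_step creates the variable on first call...
global_step = tf.train.get_or_create_global_step()

# ...after which get_global_step can retrieve it.
assert tf.train.get_global_step() is not None

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(global_step))  # a freshly created global step starts at 0
```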
2. Advanced
2.1 Manual update
global_step = tf.train.get_or_create_global_step()
op = tf.assign(global_step, global_step+1)
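Note that `tf.assign` only builds an op; the counter does not change until that op is run in a session. A minimal sketch (via `tf.compat.v1` for TensorFlow 2 compatibility, an assumption):

```python
import tensorflow.compat.v1 as tf  # on TF 1.x: import tensorflow as tf
tf.disable_eager_execution()

global_step = tf.train.get_or_create_global_step()
# Build the increment op; tf.assign_add(global_step, 1) is an equivalent form.
increment_op = tf.assign(global_step, global_step + 1)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(3):
        # assign returns the value after assignment: 1, 2, 3
        print(sess.run(increment_op))
```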
2.2 Automatic update
import tensorflow as tf

global_step = tf.train.get_or_create_global_step()
learning_rate = tf.constant(value=0.01, shape=[], dtype=tf.float32)
learning_rate = tf.train.polynomial_decay(learning_rate, global_step,
                                          500, end_learning_rate=0.0,
                                          power=2.0, cycle=True)
mean_loss = tf.Variable(1.0, dtype=tf.float32)
optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate, beta1=0.9, beta2=0.98, epsilon=1e-8)
train_op = optimizer.minimize(mean_loss, global_step=global_step)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(10):
        _, cur_step = sess.run([train_op, global_step])
        print(cur_step)
Because global_step is passed to minimize, it is incremented automatically each time train_op runs.
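A stripped-down sketch of just this behavior: the counter advances only because `global_step=` is passed to `minimize` (run via `tf.compat.v1` for TensorFlow 2 compatibility, an assumption):

```python
import tensorflow.compat.v1 as tf  # on TF 1.x: import tensorflow as tf
tf.disable_eager_execution()

global_step = tf.train.get_or_create_global_step()
loss = tf.Variable(1.0, dtype=tf.float32)

# Passing global_step makes minimize increment it after applying gradients;
# without global_step=..., the counter would stay at 0.
train_op = tf.train.AdamOptimizer(learning_rate=0.01).minimize(
    loss, global_step=global_step)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(3):
        sess.run(train_op)
    print(sess.run(global_step))  # 3: one increment per train_op run
```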