global_step is frequently needed when working with moving averages or learning-rate schedules. When you pass it as tf.train.GradientDescentOptimizer(learning_rate).minimize(loss, global_step=global_steps), the system updates it automatically: the variable is incremented by 1 on every training step, so after the first step it reads 1.
Here is an example:
import tensorflow as tf
import numpy as np

x = tf.placeholder(tf.float32, shape=[None, 1], name='x')
y = tf.placeholder(tf.float32, shape=[None, 1], name='y')
w = tf.Variable(tf.constant(0.0))

# global_steps is not trainable: the optimizer increments it, gradient descent never touches it
global_steps = tf.Variable(0, trainable=False)
learning_rate = tf.train.exponential_decay(0.1, global_steps, 10, 2, staircase=False)
loss = tf.pow(w * x - y, 2)
# Passing global_step here makes minimize() increment it on every run
train_step = tf.train.GradientDescentOptimizer(learning_rate).minimize(loss, global_step=global_steps)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(10):
        sess.run(train_step, feed_dict={x: np.linspace(1, 4, 10).reshape([10, 1]),
                                        y: np.linspace(1, 4, 10).reshape([10, 1])})
        print('learning_rate:{}'.format(sess.run(learning_rate)))
        print('global_steps:{}'.format(sess.run(global_steps)))
Output:
learning_rate:0.10717733949422836
global_steps:1
learning_rate:0.11486983299255371
global_steps:2
learning_rate:0.1231144443154335
global_steps:3
learning_rate:0.13195079565048218
global_steps:4
learning_rate:0.1414213627576828
global_steps:5
learning_rate:0.15157166123390198
global_steps:6
learning_rate:0.16245047748088837
global_steps:7
learning_rate:0.17411011457443237
global_steps:8
learning_rate:0.18660660088062286
global_steps:9
learning_rate:0.20000000298023224
global_steps:10
The learning rate starts changing from the very first training step, and global_steps is automatically incremented by 1 each time.
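The printed values follow the exponential-decay formula learning_rate = initial_rate * decay_rate ** (global_step / decay_steps); with staircase=False the exponent is the real-valued ratio rather than its integer part. As a sketch, the formula can be checked in plain Python (no TensorFlow needed), assuming the arguments 0.1, 10, and 2 from the example above:

```python
# Hand-computed version of tf.train.exponential_decay (staircase=False):
# decayed_rate = initial_rate * decay_rate ** (global_step / decay_steps)
def exponential_decay(initial_rate, global_step, decay_steps, decay_rate):
    return initial_rate * decay_rate ** (global_step / decay_steps)

for step in range(1, 11):
    print('step {}: {:.8f}'.format(step, exponential_decay(0.1, step, 10, 2)))
```

Step 1 gives 0.1 * 2 ** 0.1 ≈ 0.10717735 and step 10 gives 0.1 * 2 ** 1 = 0.2, matching the printed learning rates up to float32 precision.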

This article used a concrete TensorFlow example to show what the global_step parameter does during training, how it is updated automatically, and how it can be used to adjust the learning rate.