Per the suggestion by @user728291, I could view the gradients in TensorBoard by using the `optimize_loss` function, as follows.
The call signature of `optimize_loss` is:

```python
optimize_loss(
    loss,
    global_step,
    learning_rate,
    optimizer,
    gradient_noise_scale=None,
    gradient_multipliers=None,
    clip_gradients=None,
    learning_rate_decay_fn=None,
    update_ops=None,
    variables=None,
    name=None,
    summaries=None,
    colocate_gradients_with_ops=False,
    increment_global_step=True
)
```
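As an aside, the `clip_gradients` argument takes a float and, as far as I can tell, clips by the global norm of all gradients together (in the style of `tf.clip_by_global_norm`) rather than per-tensor. A minimal pure-Python sketch of that scaling rule, just for intuition and not the library implementation:

```python
import math

def clip_by_global_norm(grads, clip_norm):
    """Scale a list of (scalar) gradients so their joint L2 norm is at most clip_norm."""
    global_norm = math.sqrt(sum(g * g for g in grads))
    if global_norm <= clip_norm:
        return grads  # already within the allowed norm, leave untouched
    scale = clip_norm / global_norm
    return [g * scale for g in grads]

# A gradient vector with global norm 5.0, clipped down to norm 1.0,
# scales every component by 1/5 (approximately [0.6, 0.8]):
print(clip_by_global_norm([3.0, 4.0], 1.0))
```

All gradients shrink by the same factor, so clipping preserves the direction of the overall update.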
The function requires a `global_step` and relies on a few other imports, as shown below.

```python
from tensorflow.python.ops import variable_scope
from tensorflow.python.framework import dtypes
from tensorflow.python.ops import init_ops

# this needs to be defined for tf.contrib.layers.optimize_loss()
global_step = variable_scope.get_variable(
    "global_step", [],
    trainable=False,
    dtype=dtypes.int64,
    initializer=init_ops.constant_initializer(0, dtype=dtypes.int64))
```
Then replace your typical training operation

```python
training_operation = optimizer.minimize(loss_operation)
```

with

```python
training_operation = tf.contrib.layers.optimize_loss(
    loss_operation, global_step, learning_rate=rate, optimizer='Adam',
    summaries=["gradients"])
```
Then prepare a merge statement for your summaries:

```python
summary = tf.summary.merge_all()
```
Then, at the end of each run/epoch in your TensorFlow training session:

```python
summary_writer = tf.summary.FileWriter(logdir_run_x, sess.graph)
summary_str = sess.run(summary, feed_dict=feed_dict)
summary_writer.add_summary(summary_str, i)
summary_writer.flush()  # evidently this is needed sometimes or scalars will not show up on tensorboard
```
where `logdir_run_x` is a different directory for each run. That way, when TensorBoard is running, you can look at each run separately. The gradients will be under the Histograms tab, with the tag `OptimizeLoss`. It shows all the weights, all the biases, and the beta parameter as histograms.
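One simple way to get a distinct `logdir_run_x` per run (the name is just a placeholder above) is to build it from a base directory and a run counter; a small helper like this, with my own naming, not part of TensorFlow:

```python
import os

def run_logdir(base_dir, run_id):
    """Return a per-run log directory, e.g. logs/run_3, so TensorBoard lists each run separately."""
    return os.path.join(base_dir, "run_{}".format(run_id))

# Each training run then gets its own FileWriter directory, e.g.:
# summary_writer = tf.summary.FileWriter(run_logdir("logs", 3), sess.graph)
```

A timestamp instead of a counter also works and avoids accidentally reusing a directory from a previous session.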
UPDATE: Using tf.slim, there is another way that also works and is perhaps cleaner.

```python
optimizer = tf.train.AdamOptimizer(learning_rate=rate)
training_operation = slim.learning.create_train_op(loss_operation, optimizer, summarize_gradients=True)
```
By setting `summarize_gradients=True` (which is not the default), you get gradient summaries for all the weights. These will be visible in TensorBoard under `summarize_grads`.