Using TensorBoard
1. Saving data
In the graph-definition phase:
# Log a single scalar value, e.g. the loss or the accuracy.
tf.summary.scalar("loss", cross_entropy)
# Log a whole tensor of values, e.g. a weight matrix.
tf.summary.histogram("W1", W)
# Merge all tf.summary.* ops into a single op.
merged_summary_op = tf.summary.merge_all()
# Create the writer, specifying the log directory and the graph.
summary_writer = tf.summary.FileWriter('/tmp/mnist_logs', sess.graph)
# tf.name_scope groups ops under a common name prefix; in the graph
# visualization each scope is drawn as a single expandable node.
with tf.name_scope('hidden') as scope:
    W = tf.Variable(tf.random_uniform([1, 2], -1.0, 1.0), name='weights')
    b = tf.Variable(tf.zeros([1]), name='biases')
# The resulting op names are hidden/weights and hidden/biases.
In the graph-execution (training) phase:
total_step = 0
for i in range(1000):
    total_step += 1
    batch_xs, batch_ys = mnist.train.next_batch(100)
    sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})
    if total_step % 100 == 0:
        # Run the merged summary op...
        summary_str = sess.run(merged_summary_op, feed_dict={x: batch_xs, y_: batch_ys})
        # ...and write the result to the event file.
        summary_writer.add_summary(summary_str, total_step)
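The pieces above can be combined into one runnable sketch. This is a minimal illustration, not the MNIST code itself: it assumes TensorFlow 1.x behavior (via `tf.compat.v1`, so it also runs on TF 2.x) and substitutes a toy linear-regression model and a temporary log directory so the example is self-contained.

```python
import os
import tempfile

import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # TF1-style graph mode (needed on TF 2.x)

# Toy model instead of MNIST, to keep the sketch self-contained.
x = tf.placeholder(tf.float32, [None, 1])
y_ = tf.placeholder(tf.float32, [None, 1])
with tf.name_scope('linear'):
    W = tf.Variable(tf.random_uniform([1, 1], -1.0, 1.0), name='weights')
    b = tf.Variable(tf.zeros([1]), name='biases')
    y = tf.matmul(x, W) + b

loss = tf.reduce_mean(tf.square(y - y_))
train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

# Summary ops, merged into a single op as in the notes above.
tf.summary.scalar("loss", loss)
tf.summary.histogram("W", W)
merged_summary_op = tf.summary.merge_all()

logdir = os.path.join(tempfile.mkdtemp(), 'logs')  # illustrative path
xs = np.random.rand(100, 1).astype(np.float32)
ys = 3 * xs + 1

with tf.Session() as sess:
    summary_writer = tf.summary.FileWriter(logdir, sess.graph)
    sess.run(tf.global_variables_initializer())
    total_step = 0
    for i in range(200):
        total_step += 1
        sess.run(train_step, feed_dict={x: xs, y_: ys})
        if total_step % 20 == 0:
            summary_str = sess.run(merged_summary_op, feed_dict={x: xs, y_: ys})
            summary_writer.add_summary(summary_str, total_step)
    summary_writer.close()

# The log directory now holds an events.out.tfevents.* file for TensorBoard.
print(os.listdir(logdir))
```

Running `tensorboard --logdir` on the printed directory then shows the loss curve and the weight histogram.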
2. Launching TensorBoard
- In a terminal, run: tensorboard --logdir=D:/tmp/mnist_logs (the directory the FileWriter saved to)
- Copy the generated link and open it in Chrome (other browsers may not render it correctly).
3. Displaying training and test curves together
Create two FileWriters, one for the training data and one for the test data:
import tensorflow as tf
......
tf.summary.scalar("loss", loss)
merged_summary_op = tf.summary.merge_all()
with tf.Session() as sess:
    # Only the training writer needs the graph; writing it twice would be redundant.
    summary_writer1 = tf.summary.FileWriter('../log/view/train', sess.graph)
    summary_writer2 = tf.summary.FileWriter('../log/view/test')
    summary_str = sess.run(merged_summary_op, feed_dict={x: train_data, y_: train_label})
    summary_writer1.add_summary(summary_str, step)
    # summary_writer1.flush()
    summary_str = sess.run(merged_summary_op, feed_dict={x: val_data, y_: val_label})
    summary_writer2.add_summary(summary_str, step)
- In a terminal, run: tensorboard --logdir=…\log\view. Pointing --logdir at the common parent directory makes TensorBoard load both runs and overlay their curves in the same plot.
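After the two-writer loop above, the log directory should look roughly like this (the `log/view` path is the illustrative one from the snippet; each subdirectory becomes one "run" in TensorBoard):

```shell
# Assumed layout produced by the two FileWriters:
#   log/view/train/events.out.tfevents.*
#   log/view/test/events.out.tfevents.*
# Launch TensorBoard on the common parent so both runs are shown together:
tensorboard --logdir=log/view
```

TensorBoard labels each curve with its subdirectory name (train, test), so the two loss curves can be compared directly.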