TensorBoard visualization is driven mainly by the tf.summary functions.
For example, to visualize an intermediate result:
z = tf.multiply(X,W)+b
tf.summary.histogram('z',z)
To visualize the loss:
cost = tf.reduce_mean(tf.square(Y - z))
tf.summary.scalar('loss_function',cost)
Then write the logs inside the session:
init = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)
    merged_summary_op = tf.summary.merge_all()  # merge all summaries
    # create a summary_writer for writing the log files
    summary_writer = tf.summary.FileWriter('log/summaries', sess.graph)
    for epoch in range(training_epochs):
        for (x, y) in zip(train_x, train_y):
            sess.run(optimizer, feed_dict={X: x, Y: y})
        # generate the summaries (once per epoch)
        summary_str = sess.run(merged_summary_op, feed_dict={X: x, Y: y})
        summary_writer.add_summary(summary_str, epoch)
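Putting the pieces together, here is a minimal self-contained sketch of the whole pipeline. The model, the synthetic training data, and the learning rate are my own assumptions (a linear fit y ≈ 2x); it is written against the TF 1.x API the notes use, accessed through tensorflow.compat.v1 so it also runs under TensorFlow 2.x:

```python
import numpy as np
import tensorflow.compat.v1 as tf  # TF 1.x-style API, also available in TF 2.x
tf.disable_v2_behavior()

# synthetic data (assumption): y ≈ 2x plus noise
train_x = np.linspace(-1, 1, 100)
train_y = 2 * train_x + np.random.randn(*train_x.shape) * 0.3

X = tf.placeholder("float")
Y = tf.placeholder("float")
W = tf.Variable(tf.random_normal([1]), name="weight")
b = tf.Variable(tf.zeros([1]), name="bias")

z = tf.multiply(X, W) + b
tf.summary.histogram('z', z)              # histogram of the predictions
cost = tf.reduce_mean(tf.square(Y - z))
tf.summary.scalar('loss_function', cost)  # scalar loss curve
optimizer = tf.train.GradientDescentOptimizer(0.01).minimize(cost)

training_epochs = 5
init = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)
    merged_summary_op = tf.summary.merge_all()
    summary_writer = tf.summary.FileWriter('log/summaries', sess.graph)
    for epoch in range(training_epochs):
        for (x, y) in zip(train_x, train_y):
            sess.run(optimizer, feed_dict={X: x, Y: y})
        # evaluate the merged summaries once per epoch, over the full data
        summary_str = sess.run(merged_summary_op,
                               feed_dict={X: train_x, Y: train_y})
        summary_writer.add_summary(summary_str, epoch)
    summary_writer.close()
```

After running this, `log/summaries` contains an `events.out.tfevents*` file that TensorBoard can display.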
Finally, run at the command line:
tensorboard --logdir <absolute path>\summaries
If the page does not open automatically, copy the address TensorBoard prints into the Chrome browser.
Summary:
The essential lines are:
tf.summary.histogram('z', z)
tf.summary.scalar('loss_function', cost)
# the following must run inside the session
merged_summary_op = tf.summary.merge_all()  # merge all summaries
summary_writer = tf.summary.FileWriter('log/summaries', sess.graph)  # create the summary_writer for the log files
# optionally, save only once every few epochs
summary_str = sess.run(merged_summary_op, feed_dict={X: x, Y: y})
summary_writer.add_summary(summary_str, epoch)
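To sanity-check what FileWriter actually writes, the events file can be read back with summary_iterator. A minimal self-contained sketch (it writes a single scalar Summary proto to a temporary directory of my choosing, then reads it back; the names logdir and losses are mine):

```python
import glob
import tempfile
import tensorflow.compat.v1 as tf  # TF 1.x-style API, also available in TF 2.x
tf.disable_v2_behavior()

logdir = tempfile.mkdtemp()
writer = tf.summary.FileWriter(logdir)
# write one scalar summary proto directly, without building a graph
summary = tf.Summary(value=[tf.Summary.Value(tag='loss_function',
                                             simple_value=0.5)])
writer.add_summary(summary, global_step=0)
writer.close()

# read the events file back and collect the logged loss values
event_file = glob.glob(logdir + '/events.out.tfevents*')[0]
losses = [(e.step, v.simple_value)
          for e in tf.train.summary_iterator(event_file)
          for v in e.summary.value if v.tag == 'loss_function']
print(losses)  # → [(0, 0.5)]
```

This is the same file format the training loop produces, so the same loop can be used to inspect real runs under log/summaries.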