I ran into a problem when using TensorBoard for visualization. If you define

MERGED = tf.summary.merge_all()

and then later evaluate it on its own with SESS.run([MERGED]), you get the error above. Instead, you should run it together with the other ops, e.g. SESS.run([TRAIN, MERGED]); after that change the error no longer appears. I find it hard to pin down the exact cause, though my guess is that MERGED depends on LOSS, and therefore on the placeholders, so it has to be run with the same feed_dict as the training op. I searched for quite a while and found several proposed fixes, but none of them solved my problem:

https://stackoverflow.com/questions/35114376/error-when-computing-summaries-in-tensorflow
https://blog.csdn.net/lyrassongs/article/details/75012464

In the end I fixed it by following a program I found on GitHub and restructuring my code the same way.
# -*- coding: utf-8 -*-
"""
Created on Wed Oct 31 17:07:38 2018
@author: LiZebin
"""
from __future__ import print_function
import numpy as np
import tensorflow as tf
tf.reset_default_graph()
SESS = tf.Session()
LOGDIR = "logs/"
X = np.arange(0, 1000, 2, dtype=np.float32)
Y = X*2.3+5.6
X_ = tf.placeholder(tf.float32, name="X")
Y_ = tf.placeholder(tf.float32, name="Y")
W = tf.get_variable(name="Weights", shape=[1],
                    dtype=tf.float32, initializer=tf.random_normal_initializer())
B = tf.get_variable(name="bias", shape=[1],
                    dtype=tf.float32, initializer=tf.random_normal_initializer())
PRED = W*X_+B
LOSS = tf.reduce_mean(tf.square(Y_-PRED))
tf.summary.scalar("Loss", LOSS)
TRAIN = tf.train.GradientDescentOptimizer(learning_rate=0.0000001).minimize(LOSS)
WRITER = tf.summary.FileWriter(LOGDIR, SESS.graph)
MERGED = tf.summary.merge_all()
SESS.run(tf.global_variables_initializer())
for step in range(20000):
    # Writing a separate RS = SESS.run(MERGED) after this call triggers the error above
    c1, c2, loss, RS, _ = SESS.run([W, B, LOSS, MERGED, TRAIN], feed_dict={X_: X, Y_: Y})
    WRITER.add_summary(RS, step)  # pass the step so TensorBoard plots a curve, not a single point
    if step % 500 == 0:
        temp = "c1=%s, c2=%s, loss=%s" % (c1, c2, loss)
        print(temp)
WRITER.close()
SESS.close()
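Since the data are generated exactly from Y = 2.3*X + 5.6, the values the training loop should converge toward can be sanity-checked with a plain NumPy least-squares fit. This is just a side check of the expected result, not part of the TensorFlow fix:

```python
import numpy as np

# Same synthetic data as in the script above
X = np.arange(0, 1000, 2, dtype=np.float32)
Y = X * 2.3 + 5.6

# Closed-form least squares for Y = w*X + b; solve in float64 for accuracy
A = np.vstack([X, np.ones_like(X)]).T.astype(np.float64)
(w, b), *_ = np.linalg.lstsq(A, Y.astype(np.float64), rcond=None)
print(w, b)  # w ≈ 2.3, b ≈ 5.6
```

The Loss curve written to logs/ can then be inspected with tensorboard --logdir logs/.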