TensorFlow error: InvalidArgumentError (see above for traceback): tags and values not the same shape: []

Offending code:

# loss is the raw per-example cross-entropy: a vector of shape [batch_size], not a scalar
loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=tf.argmax(labels_holder, 1), logits=label_predict)
correct_predict = tf.equal(tf.argmax(label_predict, 1), tf.argmax(labels_holder, 1))
accuracy = tf.reduce_mean(input_tensor=tf.cast(x=correct_predict, dtype=tf.float32))
train_op = tf.train.GradientDescentOptimizer(learning_rate=self.lr).minimize(loss=loss)
tf.summary.scalar(name="loss", tensor=loss)  # <-- fails: tf.summary.scalar requires a rank-0 tensor
tf.summary.scalar(name="accuracy", tensor=accuracy)
merged = tf.summary.merge_all()
init_op = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init_op)
    writer = tf.summary.FileWriter(logdir="./summary/", graph=sess.graph)
    for i in range(self.epoch_num):
        batch_images, batch_labels = mnist.train.next_batch(self.batch_size)
        batch_images = tf.reshape(tensor=batch_images, shape=[self.batch_size, 28, 28, 1])
        batch_images = tf.image.resize_images(images=batch_images, size=(32, 32))
        sess.run(train_op, feed_dict={images_holder: batch_images.eval(), labels_holder: batch_labels})
        accuracy_result = sess.run(accuracy, feed_dict={images_holder: batch_images.eval(), labels_holder: batch_labels})
        summary_result = sess.run(fetches=merged, feed_dict={images_holder: batch_images.eval(), labels_holder: batch_labels})
        writer.add_summary(summary=summary_result, global_step=i)
        writer.add_summary(summary=summary_result, global_step=i)

 

Error message:

InvalidArgumentError (see above for traceback): tags and values not the same shape: [] != [100] (tag 'loss')
     [[node loss (defined at /Users/jiweiwang/Git Repository/machine_learning_basis/tensorflow_practice/image_recognition/classical_network/LeNet-5/LeNet-5.py:104)  = ScalarSummary[T=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:CPU:0"](loss/tags, SparseSoftmaxCrossEntropyWithLogits/SparseSoftmaxCrossEntropyWithLogits)]]

 

The root cause is that loss is defined with sparse_softmax_cross_entropy_with_logits:

loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=tf.argmax(labels_holder, 1), logits=label_predict)

Checking the TensorFlow API docs: https://tensorflow.google.cn/api_docs/python/tf/losses/sparse_softmax_cross_entropy

The related wrapper tf.losses.sparse_softmax_cross_entropy is documented to return: "Weighted loss Tensor of the same type as logits. If reduction is NONE, this has the same shape as labels; otherwise, it is scalar."

The lower-level tf.nn.sparse_softmax_cross_entropy_with_logits used here applies no reduction at all: it returns a tensor with the same shape as the labels, i.e. a [100] vector of per-example losses. tf.summary.scalar, however, requires a rank-0 (scalar) tensor, hence the mismatch [] != [100].
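Why the loss has shape [100] can be reproduced without TensorFlow. A minimal NumPy sketch (the batch size of 100 and the 10 classes mirror the MNIST setup above) computes the same per-example cross-entropy by hand:

```python
import numpy as np

np.random.seed(0)
batch_size, num_classes = 100, 10   # mirrors the MNIST batch above

logits = np.random.randn(batch_size, num_classes)        # raw network outputs
labels = np.random.randint(0, num_classes, batch_size)   # sparse integer labels

# numerically stable log-softmax
shifted = logits - logits.max(axis=1, keepdims=True)
log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))

# cross-entropy of each example: one loss value per sample
per_example_loss = -log_probs[np.arange(batch_size), labels]
print(per_example_loss.shape)             # (100,) -- a vector, not a scalar

# reducing it (what tf.reduce_mean would do) yields the rank-0 value
# that tf.summary.scalar expects
print(np.ndim(per_example_loss.mean()))   # 0
```

This is exactly the [] != [100] mismatch in the error message: the summary op was handed a length-100 vector where it expected a single number.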

 

Solution:

Replace tf.summary.scalar(name="loss", tensor=loss) with tf.summary.histogram(name="loss", tensor=loss), since a histogram summary accepts a tensor of any shape. Alternatively, reduce the per-example losses to a scalar first with loss = tf.reduce_mean(loss); the scalar summary then works unchanged, and the optimizer minimizes the mean batch loss, which is the more common pattern.
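The constraint behind the two fixes can be mimicked in plain Python. The helpers below are hypothetical stand-ins, not the TensorFlow API: a scalar summary insists on a rank-0 value, while a histogram summary accepts any shape, which is why either switching to tf.summary.histogram or reducing with tf.reduce_mean first resolves the error:

```python
import numpy as np

def scalar_summary(tag, value):
    """Shape check analogous to tf.summary.scalar (hypothetical helper)."""
    value = np.asarray(value, dtype=np.float32)
    if value.ndim != 0:
        raise ValueError("tags and values not the same shape: [] != %s (tag '%s')"
                         % (list(value.shape), tag))
    return tag, float(value)

def histogram_summary(tag, values):
    """Analogous to tf.summary.histogram: any shape is accepted (hypothetical helper)."""
    return tag, np.asarray(values, dtype=np.float32).ravel()

per_example_loss = np.random.rand(100).astype(np.float32)  # shape [100], like the loss op

# Fix 1: a histogram summary takes the whole vector as-is
tag, values = histogram_summary("loss", per_example_loss)

# Fix 2: reduce to a scalar first, then the scalar summary succeeds
tag, value = scalar_summary("loss", per_example_loss.mean())

# Without a reduction the scalar summary fails, just like the TF error above:
try:
    scalar_summary("loss", per_example_loss)
except ValueError as e:
    print(e)   # tags and values not the same shape: [] != [100] (tag 'loss')
```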

 

 
