Cross Entropy

tf.nn.softmax_cross_entropy_with_logits(_sentinel=None, labels=None, logits=None, dim=-1, name=None)

Let's look at a small example of how this function is used:

import tensorflow as tf

labels = [[0.2, 0.3, 0.5],
          [0.1, 0.6, 0.3]]
logits = [[2, 0.5, 1],
          [0.1, 1, 3]]
logits_scaled = tf.nn.softmax(logits)

# The op applies softmax to the raw logits internally
result1 = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)
# Equivalent manual computation: cross entropy against the softmax-ed logits
result2 = -tf.reduce_sum(labels * tf.log(logits_scaled), 1)
# Incorrect usage: passing already softmax-ed values as logits
result3 = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits_scaled)

with tf.Session() as sess:
    print(sess.run(result1))
    print(sess.run(result2))
    print(sess.run(result3))
>>>[ 1.41436887  1.66425455]
>>>[ 1.41436887  1.66425455]
>>>[ 1.17185783  1.17571414]
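
result1 and result2 agree because softmax_cross_entropy_with_logits applies softmax to the raw logits internally before computing the cross entropy. result3 differs because logits_scaled has already been softmax-ed, so the op ends up applying softmax twice; always pass raw, unscaled logits to this function.

As a sanity check, here is a minimal NumPy sketch (assuming NumPy is available) that reproduces result1/result2 by hand:

import numpy as np

labels = np.array([[0.2, 0.3, 0.5],
                   [0.1, 0.6, 0.3]])
logits = np.array([[2, 0.5, 1],
                   [0.1, 1, 3]], dtype=np.float64)

# softmax over the class axis
exp = np.exp(logits)
probs = exp / exp.sum(axis=1, keepdims=True)

# per-example cross entropy: -sum(labels * log(probs))
print(-np.sum(labels * np.log(probs), axis=1))
# ~ [ 1.41436887  1.66425455]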


Now let's look at tf.nn.sparse_softmax_cross_entropy_with_logits(_sentinel=None, labels=None, logits=None, name=None)

Return value: a 1-D Tensor of length batch_size, with the same shape as labels and the same dtype as logits.

import tensorflow as tf

# Sparse labels: the class index of each example, not a one-hot vector
labels = [0, 2]

logits = [[2, 0.5, 1],
          [0.1, 1, 3]]

result1 = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)

with tf.Session() as sess:
    print(sess.run(result1))
>>>[ 0.46436879  0.17425454]
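
The sparse version fed the class indices [0, 2] should match the dense version fed the corresponding one-hot labels. A minimal sketch of that equivalence (assuming tf.one_hot with depth=3, since there are three classes):

import tensorflow as tf

labels = [0, 2]
logits = [[2, 0.5, 1],
          [0.1, 1, 3]]

# One-hot encode the sparse labels: [0, 2] -> [[1,0,0], [0,0,1]]
onehot = tf.one_hot(labels, depth=3)

result_sparse = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)
result_dense = tf.nn.softmax_cross_entropy_with_logits(labels=onehot, logits=logits)

with tf.Session() as sess:
    print(sess.run(result_sparse))  # [ 0.46436879  0.17425454]
    print(sess.run(result_dense))   # same values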