tf.nn.softmax(
logits,
axis=None,
name=None,
dim=None
)
Softmax is essentially a normalization: whatever the range of the input values, the outputs all fall between 0 and 1 and sum to 1. Note that `dim` is the deprecated alias of `axis` in TF 1.x; use `axis`. An example follows.
# softmax example
import tensorflow as tf
import numpy as np

a = np.array([3.1, 0.2, -2.2, 1.1])
print('a = ', a)

# TF 1.x style: ops must be run inside a Session
with tf.Session() as sess:
    print('softmax(a) =', sess.run(tf.nn.softmax(a)))
Output:
a = [ 3.1 0.2 -2.2 1.1]
softmax(a) = [0.83657499 0.04603105 0.00417584 0.11321811]
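To see what softmax actually computes, here is a minimal NumPy sketch of the same formula, softmax(x)_i = exp(x_i) / sum_j exp(x_j), without TensorFlow. The `softmax` helper name is ours, not a library function; subtracting the maximum before exponentiating is a standard trick for numerical stability and does not change the result.

```python
import numpy as np

def softmax(x):
    # Subtract the max for numerical stability; the result is unchanged
    # because the factor exp(-max) cancels in the ratio.
    e = np.exp(x - np.max(x))
    return e / e.sum()

a = np.array([3.1, 0.2, -2.2, 1.1])
print(softmax(a))        # matches the TensorFlow output above
print(softmax(a).sum())  # the probabilities sum to 1.0
```

Running this reproduces [0.83657499 0.04603105 0.00417584 0.11321811], confirming that `tf.nn.softmax` is just this elementwise exponential followed by normalization.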