The proper name of softmax_loss is softmax cross entropy loss. The softmax function is defined as
$f(z_i)=\mathrm{softmax}(z_i)=\frac{e^{z_i}}{\sum_j e^{z_j}}$, and the softmax loss is defined as
$L=-\frac{1}{N}\sum_{i=0}^{N}L_i=-\frac{1}{N}\sum_{i=0}^{N}\log f(z_i)$
Gradient derivation of softmax_loss
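The well-known result of this derivation is that, for a sample with true class $y$, the gradient of the loss with respect to the logits is $\frac{\partial L}{\partial z_k}=\mathrm{softmax}(z)_k-\mathbb{1}\{k=y\}$, i.e. the predicted probabilities minus the one-hot target. A minimal NumPy sketch that checks this analytic gradient against a finite-difference estimate (the function names here are illustrative, not from the article):

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability; the result is unchanged
    e = np.exp(z - np.max(z))
    return e / e.sum()

def softmax_cross_entropy(z, y):
    # y is the index of the true class; the loss is -log softmax(z)[y]
    return -np.log(softmax(z)[y])

def analytic_grad(z, y):
    # dL/dz_k = softmax(z)_k - 1{k == y}
    p = softmax(z)
    p[y] -= 1.0
    return p

# Compare against a central finite-difference estimate of the gradient
z = np.array([2.0, 1.0, 0.1])
y = 0
eps = 1e-6
numeric_grad = np.array([
    (softmax_cross_entropy(z + eps * np.eye(3)[k], y)
     - softmax_cross_entropy(z - eps * np.eye(3)[k], y)) / (2 * eps)
    for k in range(3)
])
print(np.allclose(analytic_grad(z, y), numeric_grad, atol=1e-5))  # True
```

This compact gradient form (probabilities minus one-hot target) is why softmax and cross entropy are almost always fused into a single layer in practice: it avoids both the numerical instability of computing $\log$ and $\exp$ separately and the need to backpropagate through each individually.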
