I have always just applied the cross-entropy formula directly; recently I took some time to understand what it actually computes.
1 Formula
nn.CrossEntropyLoss combines nn.LogSoftmax and nn.NLLLoss (negative log-likelihood): it first applies LogSoftmax to the raw logits, then computes NLLLoss on the result.
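Spelled out, for a logit vector $z$ and target class $c$, this combined operation computes:

$$\text{loss}(z, c) = -\log\frac{e^{z_c}}{\sum_j e^{z_j}} = -z_c + \log\sum_j e^{z_j}$$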
Manual calculation
import torch

# Manual calculation: softmax over the logits, then cross-entropy with a one-hot target
y = torch.tensor([1.0, 0.0, 0.0])            # one-hot target: class 0
z = torch.tensor([0.2, 0.1, -0.1])           # raw logits
y_pred = torch.exp(z) / torch.exp(z).sum()   # softmax
loss = (-y * torch.log(y_pred)).sum()        # cross-entropy
print(loss)  # tensor(0.9729)
Using nn.LogSoftmax and nn.NLLLoss
criterion = torch.nn.LogSoftmax(dim=0)   # specify dim explicitly to avoid a deprecation warning
z_tensor = torch.tensor([0.2, 0.1, -0.1])
z_tensor = criterion(z_tensor)
print(z_tensor)  # tensor([-0.9729, -1.0729, -1.2729])
criterion = torch.nn.NLLLoss()
y_tensor = torch.LongTensor([0])                     # NLLLoss takes class indices, not one-hot vectors
loss = criterion(z_tensor.reshape(1, 3), y_tensor)   # reshape to (batch, classes)
print(loss)  # tensor(0.9729)
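As a check on the claim at the top, applying nn.CrossEntropyLoss directly to the raw logits should reproduce the same value in one step (variable names here are my own):

# nn.CrossEntropyLoss on raw logits: LogSoftmax + NLLLoss in one step
criterion = torch.nn.CrossEntropyLoss()
z_raw = torch.tensor([[0.2, 0.1, -0.1]])  # shape (batch, classes)
y_tensor = torch.LongTensor([0])          # class index target
loss = criterion(z_raw, y_tensor)
print(loss)  # tensor(0.9729)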