Binary classification activation function: Sigmoid
Binary classification loss function: BCELoss (a minimal usage sketch follows below)
Multi-class activation functions: Softmax, LogSoftmax
Multi-class loss functions: NLLLoss, CrossEntropyLoss
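As a minimal sketch of the binary pairing, feed the Sigmoid output into BCELoss, which expects probabilities in [0, 1]. The shapes and random data here are illustrative assumptions, not taken from the original notes.
import torch
from torch import nn
x = torch.randn(3, requires_grad=True)  # raw logits for 3 samples
y = torch.empty(3).random_(2)           # binary targets, 0. or 1.
sigmoid = nn.Sigmoid()
bce_loss = nn.BCELoss()                 # expects probabilities, hence the sigmoid
loss = bce_loss(sigmoid(x), y)
print(loss.item())
In practice, nn.BCEWithLogitsLoss fuses the sigmoid and BCELoss into a single, more numerically stable step, mirroring the LogSoftmax and CrossEntropyLoss fusions discussed below.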
LogSoftmax is equivalent to torch.log applied on top of Softmax, but the fused LogSoftmax implementation is faster and more numerically stable:
import torch
from torch import nn
x = torch.randn(3, 5, requires_grad=True)  # batch of 3 samples, 5 classes
softmax = nn.Softmax(dim=1)
log_softmax = nn.LogSoftmax(dim=1)
a1 = torch.log(softmax(x))  # log applied on top of softmax
a2 = log_softmax(x)         # fused log-softmax
print(torch.allclose(a1, a2, atol=1e-05))  # True: the two results agree
CrossEntropyLoss is equivalent to LogSoftmax followed by NLLLoss, which is why CrossEntropyLoss takes raw logits rather than probabilities:
import torch
from torch import nn
x = torch.randn(3, 5, requires_grad=True)        # raw logits: 3 samples, 5 classes
y = torch.empty(3, dtype=torch.long).random_(5)  # target class indices in [0, 5)
log_softmax = nn.LogSoftmax(dim=1)
nll_loss = nn.NLLLoss()                          # expects log-probabilities
cross_entropy_loss = nn.CrossEntropyLoss()       # expects raw logits
v1 = nll_loss(log_softmax(x), y)
v2 = cross_entropy_loss(x, y)
print(torch.allclose(v1, v2))  # True: identical up to floating-point error
—————————————————————————————————————————————————————
In the binary classification setting, sigmoid and softmax are equivalent: a softmax over the two logits (0, z) assigns the positive class the probability e^z / (1 + e^z), which is exactly sigmoid(z):
https://www.aiuai.cn/aifarm679.html
https://gist.github.com/ypwhs/6905ebbda99d04621f9fc00417657ae2
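A minimal numerical check of this equivalence (the two-logit construction and shapes here are illustrative assumptions): stacking a zero logit against z and taking the softmax probability of class 1 reproduces sigmoid(z).
import torch
z = torch.randn(4)  # arbitrary binary logits
two_class_logits = torch.stack([torch.zeros_like(z), z], dim=1)  # per-sample logits (0, z)
p_softmax = torch.softmax(two_class_logits, dim=1)[:, 1]  # softmax probability of class 1
p_sigmoid = torch.sigmoid(z)
print(torch.allclose(p_softmax, p_sigmoid))  # True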