I couldn't find documentation spelling out a concrete difference between torch.softmax, torch.nn.Softmax, and torch.nn.functional.softmax; their usage is essentially the same. Always pass dim explicitly (e.g. 0 or -1): torch.softmax raises an error if dim is omitted, while the other two fall back to an implicit dimension choice and emit a deprecation warning. log_softmax returns the log of the softmax values.
import torch
x = torch.rand(5)
x1 = torch.softmax(x, dim=-1)
x2 = torch.nn.Softmax(dim=-1)(x)
x3 = torch.nn.functional.softmax(x, dim=-1)
x4 = torch.nn.functional.log_softmax(x, dim=-1)
print(x1)
print(x2)
print(x3)
print(x4)
print(torch.log(x3))
# Sample output (input is random, values vary per run):
# tensor([0.1252, 0.2240, 0.1123, 0.2652, 0.2733])
# tensor([0.1252, 0.2240, 0.1123, 0.2652, 0.2733])
# tensor([0.1252, 0.2240, 0.1123, 0.2652, 0.2733])
# tensor([-2.0781, -1.4959, -2.1864, -1.3272, -1.2973])
# tensor([-2.0781, -1.4959, -2.1864, -1.3272, -1.2973])
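Since the note above says dim can be 0 or -1, here is a minimal sketch (my own example, not from any official docs) showing how dim selects the axis that is normalized on a 2D tensor: dim=0 makes each column sum to 1, dim=-1 makes each row sum to 1.

```python
import torch

x = torch.rand(2, 3)

# dim=0: softmax runs down each column, so every column sums to 1
cols = torch.softmax(x, dim=0)

# dim=-1 (the last dim): softmax runs across each row, so every row sums to 1
rows = torch.softmax(x, dim=-1)

print(cols.sum(dim=0))   # three values, each close to 1
print(rows.sum(dim=-1))  # two values, each close to 1
```

For a typical (batch, classes) logits tensor, dim=-1 is the usual choice, since you want a probability distribution over classes for each sample.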