PyTorch activation functions
Activation functions can be invoked in several ways in PyTorch, so this is a consolidated reference.
In summary there are four forms: classes under torch.nn, functions under torch.nn.functional, top-level torch.* functions, and torch.Tensor methods. All snippets below assume import torch and import torch.nn as nn.
tanh
There are four entry points: torch.nn.functional.tanh, torch.nn.Tanh, torch.tanh, and torch.Tensor.tanh; torch.nn.Tanh is the most commonly used.
torch.nn.Tanh() : class
m = nn.Tanh()
input = torch.randn(2)
output = m(input) # tensor([0.4688, 0.9303])
torch.nn.functional.tanh(input) : function (deprecated; use torch.tanh instead)
torch.nn.functional.tanh(torch.tensor([1.1,1.2,1.3])) # tensor([0.8005, 0.8337, 0.8617])
torch.tanh(input) : function, returns a new tensor
a = torch.randn(4)
torch.tanh(a) # tensor([-0.9382, -0.6782, -0.7915, -0.7089])
torch.Tensor.tanh(input) : Tensor method; per the official docs, same as torch.tanh, returns a new tensor (usually called as a.tanh())
a = torch.randn(4)
torch.Tensor.tanh(a) # tensor([ 0.4444, -0.9104, -0.9536, -0.8732])
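The four entry points above compute the same element-wise function; a quick sketch verifying they agree on a shared input (variable names are illustrative):

```python
import torch
import torch.nn as nn

x = torch.tensor([-1.0, 0.0, 0.5, 2.0])

out_module = nn.Tanh()(x)   # class form: instantiate, then call
out_func   = torch.tanh(x)  # top-level function
out_method = x.tanh()       # Tensor method, same as torch.Tensor.tanh(x)

# All three produce identical results.
assert torch.allclose(out_module, out_func)
assert torch.allclose(out_func, out_method)
print(out_func)  # tensor([-0.7616,  0.0000,  0.4621,  0.9640])
```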
sigmoid
torch.nn.Sigmoid() : class
m = nn.Sigmoid()
input = torch.randn(2)
output = m(input) # tensor([0.5604, 0.4958])
torch.nn.functional.sigmoid(input) : function (deprecated; use torch.sigmoid instead)
input = torch.randn(2)
torch.nn.functional.sigmoid(input) # tensor([0.5447, 0.4489])
torch.sigmoid(input) : function
input = torch.randn(2)
torch.sigmoid(input) # tensor([0.5460, 0.5107])
torch.Tensor.sigmoid(input) : Tensor method (usually called as input.sigmoid())
input = torch.randn(2)
torch.Tensor.sigmoid(input) # tensor([0.4828, 0.3209])
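Whichever form is used, sigmoid computes 1 / (1 + exp(-x)); a minimal check of the built-in against the formula:

```python
import torch

x = torch.tensor([-2.0, 0.0, 3.0])
# sigmoid(x) = 1 / (1 + exp(-x))
manual = 1.0 / (1.0 + torch.exp(-x))

assert torch.allclose(torch.sigmoid(x), manual)
print(torch.sigmoid(x))  # tensor([0.1192, 0.5000, 0.9526])
```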
ReLU
torch.nn.ReLU() : class
input = torch.randn(2)
m = torch.nn.ReLU()
m(input) # tensor([0.0000, 0.3668])
torch.nn.functional.relu(input) : function
input = torch.randn(2)
torch.nn.functional.relu(input) # tensor([0., 0.])
torch.relu(input) : function
input = torch.randn(2)
torch.relu(input) # tensor([0.0000, 1.0291])
torch.Tensor.relu(input) : Tensor method
input = torch.randn(2)
torch.Tensor.relu(input) # tensor([0.0000, 1.1773])
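In practice, the class/function split matters most when building models: the class form slots into containers like nn.Sequential, while the function form is convenient inside a custom forward(). A sketch (TinyNet and its layer sizes are made up for illustration):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Function form: F.relu called directly in forward().
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 8)
        self.fc2 = nn.Linear(8, 2)

    def forward(self, x):
        return self.fc2(F.relu(self.fc1(x)))

# Class form: nn.ReLU() as a layer inside nn.Sequential.
seq = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

x = torch.randn(3, 4)
print(TinyNet()(x).shape, seq(x).shape)  # torch.Size([3, 2]) torch.Size([3, 2])
```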
Softmax
torch.nn.Softmax(dim=None) : class
m = nn.Softmax(dim=1)
input = torch.randn(2, 3)
output = m(input) # tensor([[0.1456, 0.4903, 0.3641], [0.4711, 0.3158, 0.2131]])
torch.nn.functional.softmax(input, dim=None) : function
input = torch.randn(2, 3)
torch.nn.functional.softmax(input, dim=-1) # tensor([[0.0901, 0.3959, 0.5140],[0.5739, 0.1069, 0.3192]])
torch.softmax(input, dim) : function (dim has no default here)
input = torch.randn(2, 3)
torch.softmax(input, dim=-1) # tensor([[0.0071, 0.9044, 0.0886],[0.6710, 0.3048, 0.0242]])
torch.Tensor.softmax(input, dim) : Tensor method
input = torch.randn(2, 3)
torch.Tensor.softmax(input, dim=-1)
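Unlike the element-wise activations above, softmax needs a dim argument: it picks the axis that gets normalized, so every slice along that axis sums to 1. A quick sketch of both choices on a 2x3 tensor:

```python
import torch

x = torch.randn(2, 3)
rows = torch.softmax(x, dim=-1)  # normalize each row (last axis)
cols = torch.softmax(x, dim=0)   # normalize each column

# Each slice along the chosen dim sums to 1.
assert torch.allclose(rows.sum(dim=-1), torch.ones(2))
assert torch.allclose(cols.sum(dim=0), torch.ones(3))
```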