ReLU
torch.nn.ReLU(inplace=False)
Parameters
inplace – can optionally do the operation in-place. Default: False
Shape:
Input: (*), where * means any number of dimensions.
Output: (*), same shape as the input.
Examples
>>> m = nn.ReLU()
>>> input = torch.randn(2)
>>> output = m(input)
An implementation of CReLU (Concatenated ReLU) – https://arxiv.org/abs/1603.05201
>>> m = nn.ReLU()
>>> input = torch.randn(2).unsqueeze(0)
>>> output = torch.cat((m(input), m(-input)))
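The semantics behind the two examples above can be sketched in plain Python (no torch; a minimal illustration, assuming element-wise relu(x) = max(0, x) and CReLU as the concatenation of ReLU applied to the input and to its negation):

```python
# Illustrative sketch: ReLU keeps positive values and zeroes out
# negatives; CReLU concatenates ReLU(x) with ReLU(-x), doubling
# the feature dimension so no sign information is discarded.

def relu(xs):
    return [max(0.0, x) for x in xs]

def crelu(xs):
    # Concatenated ReLU: [ReLU(x), ReLU(-x)]
    return relu(xs) + relu([-x for x in xs])

print(relu([-1.0, 2.0]))   # [0.0, 2.0]
print(crelu([-1.0, 2.0]))  # [0.0, 2.0, 1.0, 0.0]
```

Note how CReLU preserves the magnitude of negative inputs (the 1.0 above) that plain ReLU discards.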
Sigmoid
torch.nn.Sigmoid
Shape:
Input: (*), where * means any number of dimensions.
Output: (*), same shape as the input.
Examples:
>>> m = nn.Sigmoid()
>>> input = torch.randn(2)
>>> output = m(input)
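The element-wise function Sigmoid computes is sigmoid(x) = 1 / (1 + exp(-x)), which maps any real input into the open interval (0, 1). A minimal plain-Python sketch (no torch):

```python
import math

# Element-wise sigmoid: 1 / (1 + e^(-x)).
# Output is always strictly between 0 and 1; sigmoid(0) = 0.5.
def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0.0))  # 0.5
```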
Tanh
torch.nn.Tanh
Examples:
>>> m = nn.Tanh()
>>> input = torch.randn(2)
>>> output = m(input)
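Tanh applies the element-wise hyperbolic tangent, tanh(x) = (e^x - e^-x) / (e^x + e^-x), which squashes inputs into (-1, 1) and is zero-centered (unlike the sigmoid). A minimal plain-Python sketch (no torch):

```python
import math

# Element-wise tanh via its exponential definition;
# tanh(0) = 0, and outputs lie strictly in (-1, 1).
def tanh(x):
    e_pos, e_neg = math.exp(x), math.exp(-x)
    return (e_pos - e_neg) / (e_pos + e_neg)

print(tanh(0.0))  # 0.0
```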