torch.nn.ReLU
Prototype
CLASS torch.nn.ReLU(inplace=False)
Parameters
inplace (bool) – whether to perform the operation in-place. Default: False. (An in-place example is sketched after the code section below.)
Definition
$\text{ReLU}(x) = (x)^+ = \max(0, x)$
Figure (the ReLU activation curve)
Code
import torch
import torch.nn as nn

m = nn.ReLU()
input = torch.randn(4)    # random input, so the printed values differ per run
output = m(input)         # negative entries are mapped to 0

print("input: ", input)    # e.g. input:  tensor([ 1.5239, -0.5669, -2.8642, -0.0029])
print("output: ", output)  # e.g. output: tensor([1.5239, 0.0000, 0.0000, 0.0000])