torch.nn.LeakyReLU
Prototype
CLASS torch.nn.LeakyReLU(negative_slope=0.01, inplace=False)
Parameters
- negative_slope (float) – controls the slope of the function for negative inputs. Default: 1e-2
- inplace (bool) – if True, performs the operation in place, overwriting the input tensor (see the sketch after this list). Default: False
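A minimal sketch of what inplace=True means, assuming a tensor that does not require gradients: the activation writes its result back into the input buffer instead of allocating a new output tensor.

import torch
import torch.nn as nn

x = torch.tensor([-1.0, 2.0])
nn.LeakyReLU(0.01, inplace=True)(x)  # result is written back into x
print(x)  # tensor([-0.0100,  2.0000]) -- the original values are gone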
Definition
$$\text{LeakyReLU}(x) = \max(0, x) + \text{negative\_slope} * \min(0, x)$$
or
$$\text{LeakyReLU}(x) = \begin{cases} x, & \text{if } x \geq 0 \\ \text{negative\_slope} \times x, & \text{otherwise} \end{cases}$$
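The two forms are equivalent. As a quick sanity check, the piecewise rule can be rebuilt from torch.clamp and compared against the library's own torch.nn.functional.leaky_relu (the sample values and slope below are arbitrary):

import torch
import torch.nn.functional as F

x = torch.tensor([-2.0, -0.5, 0.0, 1.5])
slope = 0.01
manual = torch.clamp(x, min=0) + slope * torch.clamp(x, max=0)  # max(0,x) + slope*min(0,x)
print(torch.allclose(manual, F.leaky_relu(x, negative_slope=slope)))  # True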
Figure
(plot of LeakyReLU: y = x for x ≥ 0, y = negative_slope · x for x < 0)
Code
import torch
import torch.nn as nn

m = nn.LeakyReLU(0.1)  # negative_slope = 0.1
x = torch.randn(2)     # renamed from `input` to avoid shadowing the built-in
y = m(x)
print("input: ", x)    # input:  tensor([-1.5754,  0.6229])
print("output: ", y)   # output: tensor([-0.1575,  0.6229])  (-1.5754 * 0.1 = -0.1575)