First, a comparison plot of these activation functions:
Sigmoid
Note that it does not pass through the point (0,0); Sigmoid crosses the y-axis at (0, 0.5):
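Since the original formula image is not reproduced here, a minimal stdlib-only sketch of the definition Sigmoid(x) = 1 / (1 + e^{-x}) (the function name `sigmoid` is my own) illustrates the (0, 0.5) crossing:

```python
import math

def sigmoid(x: float) -> float:
    # Logistic sigmoid: 1 / (1 + e^{-x}); output is always in (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0.0))  # 0.5, i.e. the curve crosses the y-axis at (0, 0.5)
```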
ELU (Exponential Linear Unit)
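The ELU formula image is missing as well; as a sketch, the standard definition is ELU(x) = x for x > 0 and α(e^x − 1) for x ≤ 0, with α commonly set to 1 (the helper name `elu` is my own):

```python
import math

def elu(x: float, alpha: float = 1.0) -> float:
    # ELU: identity for positive inputs; a smooth exponential
    # curve approaching -alpha for negative inputs
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

print(elu(-100.0))  # close to -1.0: the negative side saturates at -alpha
```

Unlike ReLU, ELU has non-zero gradients for negative inputs, which is the usual motivation given for it.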
ReLU (Rectified Linear Unit)
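ReLU is simply ReLU(x) = max(0, x); a one-line stdlib sketch (function name `relu` is my own):

```python
def relu(x: float) -> float:
    # ReLU: pass positive inputs through unchanged, clamp negatives to 0
    return max(0.0, x)

print([relu(v) for v in (-2.0, 0.0, 3.0)])  # [0.0, 0.0, 3.0]
```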
GELU (Gaussian Error Linear Unit)
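Following the definition in the linked PyTorch docs, GELU(x) = x · Φ(x), where Φ is the standard normal CDF. A stdlib sketch of the exact form via the error function (the name `gelu` is my own; PyTorch also offers a tanh approximation):

```python
import math

def gelu(x: float) -> float:
    # Exact GELU: x * Phi(x), with Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    # the CDF of the standard normal distribution
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

print(gelu(0.0))  # 0.0: like ReLU, GELU passes through the origin
```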
The Swish activation function is also called SiLU
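Swish/SiLU is defined as x · sigmoid(x); a stdlib sketch (the name `silu` is my own):

```python
import math

def silu(x: float) -> float:
    # Swish / SiLU: x * sigmoid(x) = x / (1 + e^{-x})
    return x / (1.0 + math.exp(-x))

print(silu(0.0))  # 0.0; for large positive x, silu(x) approaches x
```

Note how close this is to GELU: both multiply x by a smooth gate in (0, 1), which is why their curves look so similar in comparison plots.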
Reprinted from:
- https://mp.weixin.qq.com/s/BY5_NKrSOMQ0o4GxWUeYKA
- https://pytorch.org/docs/stable/generated/torch.nn.GELU.html
- https://blog.csdn.net/renwudao24/article/details/44465407