Reference: The Activation in Deep Learning (a brief discussion of activation functions in deep learning)
1. Sigmoid:
$$f(x) = \frac{1}{1+e^{-x}}$$
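A minimal NumPy sketch of the sigmoid formula above (the function name `sigmoid` is our own choice):

```python
import numpy as np

def sigmoid(x):
    # f(x) = 1 / (1 + e^{-x}); maps any real input into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))
```

Note that sigmoid is centered at 0.5: `sigmoid(0) == 0.5`, and large positive inputs saturate toward 1.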
2. Tanh:
$$f(x) = \frac{1-e^{-2x}}{1+e^{-2x}}$$
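The same formula can be sketched directly in NumPy (this expression is mathematically equivalent to the built-in `np.tanh`):

```python
import numpy as np

def tanh(x):
    # f(x) = (1 - e^{-2x}) / (1 + e^{-2x}); maps input into (-1, 1), zero-centered
    return (1.0 - np.exp(-2.0 * x)) / (1.0 + np.exp(-2.0 * x))
```

Unlike sigmoid, tanh is zero-centered, which often helps gradient flow in practice.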
3. ReLU:
$$f(x) = \begin{cases} 0 & (x \leq 0) \\ x & (x > 0) \end{cases}$$
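The piecewise definition above reduces to an elementwise maximum, which is how ReLU is commonly written in NumPy:

```python
import numpy as np

def relu(x):
    # f(x) = 0 for x <= 0, x for x > 0, i.e. max(0, x) elementwise
    return np.maximum(0.0, x)
```

Because ReLU is a simple threshold, it is cheap to compute and its gradient is either 0 or 1, avoiding the saturation seen in sigmoid and tanh.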