$tanh(z)=\frac{e^z-e^{-z}}{e^z+e^{-z}}$ works strictly better than the $sigmoid$ function in hidden layers, because it is symmetric about $(0,0)$ and therefore centers the activations around zero mean. For the output layer of a binary classifier, however, $sigmoid(z)$ is still the usual choice, since it maps the output into $0\sim1$ (a probability).
$ReLU(z)=max(0,z)$: since the rectified linear unit appeared, it has become the default activation function for neural networks. In practice most units operate in the region $z>0$, where the gradient is nonzero.
$leaky(z)=max(0.01z,z)$ avoids the zero slope that ReLU has when $z<0$. The output layer sometimes uses a linear activation function instead (e.g. house-price prediction).
0. Linear Activation Function
https://towardsdatascience.com/activation-functions-neural-networks-1cbd9f8d91d6
Also called the identity activation function: it applies no transformation at all to the raw output, $g(z)=z$.
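As a minimal sketch, the identity activation simply passes its input through unchanged, which is why it is typically reserved for regression-style output layers:

```python
import numpy as np

def identity(z):
    """Identity (linear) activation: returns the input unchanged."""
    return np.asarray(z, dtype=float)

# The raw scores pass through untouched, e.g. for a regression output layer.
print(identity([-2.0, 0.0, 3.5]))
```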
1. Sigmoid Activation Function

$$\begin{aligned} a&=g(z) \\ &=\frac{1}{1+e^{-z}} \tag{1-1} \end{aligned}$$

$$\begin{aligned} g'(z)&=\frac{d}{dz}g(z)\\ &=\frac{e^{-z}}{\left(1+e^{-z}\right)^2}\\ &=a(1-a) \tag{1-2} \end{aligned}$$

2. Tanh Activation Function
$$\begin{aligned} a&=g(z) \\ &=\frac{e^{z}-e^{-z}}{e^{z}+e^{-z}} \tag{2-1} \end{aligned}$$
$$\begin{aligned} g'(z)&=\frac{d}{dz}g(z)\\ &=\frac{\left(e^{z}+e^{-z}\right)^2-\left(e^z-e^{-z}\right)^2}{\left(e^z+e^{-z}\right)^2}\\ &=1-\left(g(z)\right)^2\\ &=1-a^2 \tag{2-2} \end{aligned}$$
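The closed-form derivative in Eq. (2-2) can be cross-checked numerically. This sketch compares $1-a^2$ against a central finite difference:

```python
import numpy as np

def tanh(z):
    # Eq. (2-1): (e^z - e^-z) / (e^z + e^-z)
    return (np.exp(z) - np.exp(-z)) / (np.exp(z) + np.exp(-z))

def tanh_prime(z):
    # Eq. (2-2): g'(z) = 1 - a^2, where a = g(z)
    a = tanh(z)
    return 1.0 - a ** 2

z = np.linspace(-3.0, 3.0, 7)
h = 1e-6
# Central difference approximation of the derivative
numeric = (tanh(z + h) - tanh(z - h)) / (2.0 * h)
# The closed form and the numerical estimate agree to high precision
print(np.max(np.abs(tanh_prime(z) - numeric)))
```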
3. ReLU and Leaky ReLU
ReLU:
$$\begin{aligned} a&=g(z) \\ &=\max(0,z) \tag{3-1} \end{aligned}$$
$$\begin{aligned} g'(z)&=\frac{d}{dz}g(z)\\ &=\left\{ \begin{aligned} 0 &\quad \text{if } z<0\\ 1 &\quad \text{if } z\geq 0 \end{aligned} \right. \tag{3-2} \end{aligned}$$
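Eqs. (3-1) and (3-2) translate directly into elementwise NumPy operations. Note that the derivative at $z=0$ is undefined mathematically; following the convention in Eq. (3-2), this sketch assigns it the value 1:

```python
import numpy as np

def relu(z):
    # Eq. (3-1): max(0, z), applied elementwise
    return np.maximum(0.0, z)

def relu_prime(z):
    # Eq. (3-2): 0 for z < 0, 1 for z >= 0 (the kink at z = 0 is set to 1 by convention)
    return np.where(z < 0, 0.0, 1.0)

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(z))        # negative inputs are clipped to 0, the rest pass through
print(relu_prime(z))  # gradient is 0 on the negative side, 1 elsewhere
```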
Leaky ReLU:
$$\begin{aligned} a&=g(z) \\ &=\max(0.01z,z) \tag{3-3} \end{aligned}$$
$$\begin{aligned} g'(z)&=\frac{d}{dz}g(z)\\ &=\left\{ \begin{aligned} 0.01 &\quad \text{if } z<0\\ 1 &\quad \text{if } z\geq 0 \end{aligned} \right. \tag{3-4} \end{aligned}$$
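The same pattern works for Leaky ReLU; the only change is the small nonzero slope 0.01 on the negative side, which is exactly what keeps the gradient in Eq. (3-4) from vanishing:

```python
import numpy as np

def leaky_relu(z, alpha=0.01):
    # Eq. (3-3): max(alpha*z, z); alpha = 0.01 is the slope for z < 0
    return np.maximum(alpha * z, z)

def leaky_relu_prime(z, alpha=0.01):
    # Eq. (3-4): alpha for z < 0, 1 for z >= 0
    return np.where(z < 0, alpha, 1.0)

z = np.array([-2.0, 0.0, 3.0])
print(leaky_relu(z))        # negative inputs keep a small signal instead of 0
print(leaky_relu_prime(z))  # gradient is 0.01 (not 0) on the negative side
```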
4. Criteria for Choosing an Activation Function
- If the problem is binary classification with outputs 0 and 1, choose sigmoid for the output layer and ReLU for the other units (tanh can sometimes be used as well). In theory Leaky ReLU is better than ReLU, but in practice they perform about the same.
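The rule above can be sketched as a tiny forward pass: ReLU in the hidden layer, sigmoid at the output so the result is a probability in $(0,1)$. The layer sizes (3 inputs, 4 hidden units, 1 output) and random weights here are hypothetical, chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical shapes: 3 input features, 4 hidden units, 1 output unit
W1, b1 = rng.standard_normal((4, 3)), np.zeros((4, 1))
W2, b2 = rng.standard_normal((1, 4)), np.zeros((1, 1))

x = rng.standard_normal((3, 1))    # one input example
a1 = relu(W1 @ x + b1)             # hidden layer: ReLU
y_hat = sigmoid(W2 @ a1 + b2)      # output layer: sigmoid -> value in (0, 1)
print(float(y_hat))
```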