1. Advanced Activation Layers
1.1. LeakyReLU Layer
LeakyReLU is a variant of the Rectified Linear Unit (ReLU). When the unit is not active, LeakyReLU still produces a small non-zero output, so a small gradient flows back and the "dying ReLU" problem, in which a neuron stops updating, is avoided.
keras.layers.advanced_activations.LeakyReLU(alpha=0.3)
- alpha: float greater than 0; the slope of the function for x < 0 (the segment in the third quadrant of the activation plot)
The output shape is the same as the input shape.
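A minimal usage sketch, assuming the Keras 1.x Sequential API implied by the signature above (the Dense layer and its sizes are illustrative, not from the original text); the activation is added as a standalone layer after a layer left with a linear activation:

from keras.models import Sequential
from keras.layers import Dense
from keras.layers.advanced_activations import LeakyReLU

model = Sequential()
model.add(Dense(64, input_dim=20))   # Dense keeps its default linear activation
model.add(LeakyReLU(alpha=0.1))      # output = x for x >= 0, 0.1 * x for x < 0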
1.2. PReLU Layer
This layer is the Parametric ReLU (PReLU). Its expression is: f(x) = alpha * x for x < 0, f(x) = x for x >= 0, where alpha is a learnable parameter vector with the same shape as x.
keras.layers.advanced_activations.PReLU(init='zero', weights=None)
- init: initialization function for alpha
- weights: initial value of alpha, given as a list containing a single numpy array
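A sketch of PReLU in the same assumed Keras 1.x Sequential setting (layer sizes are illustrative); unlike LeakyReLU, the slope alpha is learned during training:

from keras.models import Sequential
from keras.layers import Dense
from keras.layers.advanced_activations import PReLU

model = Sequential()
model.add(Dense(64, input_dim=20))
model.add(PReLU(init='zero'))  # learns one alpha per unit of the preceding layer, all initialized to 0
# An explicit starting value can instead be supplied via weights, e.g. weights=[numpy.zeros((64,))]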
1.3. ELU Layer
The ELU layer is the Exponential Linear Unit (ELU). Its expression is: f(x) = alpha * (exp(x) - 1.) for x < 0, f(x) = x for x >= 0.
keras.layers.advanced_activations.ELU(alpha=1.0)
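A sketch of ELU under the same assumptions (Keras 1.x Sequential API, illustrative layer sizes); alpha here is a fixed hyperparameter, not a learned weight:

from keras.models import Sequential
from keras.layers import Dense
from keras.layers.advanced_activations import ELU

model = Sequential()
model.add(Dense(64, input_dim=20))
model.add(ELU(alpha=1.0))  # negative inputs saturate smoothly toward -alpha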
1.4. ParametricSoftplus Layer
This layer is a parameterized Softplus. Its expression is: f(x) = alpha * log(1 + exp(beta * x)).
keras.layers.advanced_activations.ParametricSoftplus(alpha_init=0.2, beta_init=5.0, weights=None)
- alpha_init: float, initial value of alpha
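A sketch of ParametricSoftplus, again assuming the Keras 1.x Sequential API (this layer is only available in older Keras releases; layer sizes are illustrative); alpha and beta are both learned during training, starting from alpha_init and beta_init:

from keras.models import Sequential
from keras.layers import Dense
from keras.layers.advanced_activations import ParametricSoftplus

model = Sequential()
model.add(Dense(64, input_dim=20))
model.add(ParametricSoftplus(alpha_init=0.2, beta_init=5.0))  # f(x) = alpha * log(1 + exp(beta * x)) with learnable alpha, beta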