The ReLU layer is one of the nonlinear activations used in deep learning; it usually comes right after a convolution or normalization layer (although this is not mandatory).
First, let's take a look at ReLUParameter:
// Message that stores parameters used by ReLULayer
message ReLUParameter {
  // Allow non-zero slope for negative inputs to speed up optimization
  // Described in:
  // Maas, A. L., Hannun, A. Y., & Ng, A. Y. (2013). Rectifier nonlinearities
  // improve neural network acoustic models. In ICML Workshop on Deep Learning
  // for Audio, Speech, and Language Processing.
  optional float negative_slope = 1 [default = 0]; // slope for negative x: 0 gives the standard ReLU; a nonzero value gives a ReLU variant (leaky ReLU)
  enum Engine {
    DEFAULT = 0;
    CAFFE = 1;
    CUDNN = 2;
  }
  optional Engine engine = 2 [default = DEFAULT];
}
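To make the effect of negative_slope concrete, here is a minimal C++ sketch of the elementwise computation the layer performs. This is illustrative only, not the actual Caffe source; the function name relu_forward and its signature are made up for this example:

#include <algorithm>
#include <cstddef>

// Elementwise ReLU with an optional slope for negative inputs:
//   y = max(x, 0) + negative_slope * min(x, 0)
// negative_slope == 0 gives the standard ReLU; a nonzero value gives leaky ReLU.
void relu_forward(const float* bottom, float* top, std::size_t count,
                  float negative_slope = 0.0f) {
  for (std::size_t i = 0; i < count; ++i) {
    top[i] = std::max(bottom[i], 0.0f)
           + negative_slope * std::min(bottom[i], 0.0f);
  }
}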
This is how a ReLU layer is written in a prototxt. Note that bottom and top name the same blob, so Caffe performs the activation in place, which saves memory:
layer {
  name: "relu"
  type: "ReLU"
  bottom: "conv/bn"
  top: "conv/bn"
}
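To turn this into a leaky ReLU, add a relu_param block setting negative_slope; the value 0.1 below is just an example:

layer {
  name: "relu"
  type: "ReLU"
  bottom: "conv/bn"
  top: "conv/bn"
  relu_param {
    negative_slope: 0.1
  }
}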
For example, in MobileNet:
layer {
  name: "relu6_4"
  type: "ReLU"
  bottom: "conv6_4/bn"
  top: "conv6_4/bn"
}
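For completeness, negative_slope also shows up in backpropagation: the gradient passes through unchanged where the input was positive and is scaled by the slope where it was not. A minimal sketch under the same assumptions as above (relu_backward is a hypothetical name, not Caffe's API):

#include <cstddef>

// Gradient of the activation above:
//   dL/dx = dL/dy                  for x > 0
//   dL/dx = negative_slope * dL/dy for x <= 0
void relu_backward(const float* bottom, const float* top_diff,
                   float* bottom_diff, std::size_t count,
                   float negative_slope = 0.0f) {
  for (std::size_t i = 0; i < count; ++i) {
    bottom_diff[i] = top_diff[i]
        * ((bottom[i] > 0.0f) ? 1.0f : negative_slope);
  }
}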