1. Sigmoid
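The Sigmoid layer computes the logistic function element-wise:
f(x) = 1 / (1 + exp(-x))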
layer {
  name: "encode1act"
  type: "Sigmoid"
  bottom: "encode1"
  top: "encode1neuron"
}
2. ReLU / Rectified-Linear and Leaky-ReLU
Optional parameter: negative_slope (default 0). This generalizes the standard ReLU: when set, negative inputs are no longer clamped to 0 but are instead multiplied by negative_slope, which yields a leaky ReLU (see the sketch after the snippet below).
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "pool1"
  top: "pool1"
}
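The leaky variant is configured through a relu_param block; a minimal sketch, reusing the in-place setup above (the 0.01 slope is an illustrative value, not from the original):
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "pool1"
  top: "pool1"
  relu_param {
    negative_slope: 0.01
  }
}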
The ReLU layer supports in-place computation, meaning the top blob can be the same as the bottom blob (as with "pool1" above) to avoid extra memory consumption.
3. TanH / Hyperbolic Tangent
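Computes the hyperbolic tangent element-wise:
f(x) = tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x))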
layer {
  name: "layer"
  type: "TanH"
  bottom: "input"
  top: "out"
}
4. Absolute Value
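Computes the absolute value element-wise:
f(x) = |x|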
layer {
  name: "layer"
  type: "AbsVal"
  bottom: "in"
  top: "out"
}
5. Power
Applies a power function to each input element: f(x) = (shift + scale * x) ^ power.
Optional parameters:
power: default 1
scale: default 1
shift: default 0
layer {
  name: "layer"
  type: "Power"
  bottom: "in"
  top: "out"
  power_param {
    power: 2
    scale: 1
    shift: 0
  }
}
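With the parameters above (power: 2, scale: 1, shift: 0), the layer computes f(x) = (0 + 1 * x)^2 = x^2, i.e., it squares each input.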
6. BNLL (Binomial Normal Log Likelihood)
f(x) = log(1 + exp(x))
layer {
  name: "layer"
  type: "BNLL"
  bottom: "in"
  top: "out"
}