The blog lists four activation functions — sigmoid, tanh, ReLU, and Leaky ReLU — which is basically enough for a machine-learning beginner. Logistic regression uses sigmoid as its nonlinear transform, mapping the y value into the (0, 1) range.
| Name | Function | Derivative |
| --- | --- | --- |
| sigmoid | g(z) = 1 / (1 + e^(-z)) | g(z)(1 - g(z)) |
| tanh | tanh(z) | 1 - tanh(z)^2 |
| ReLU | max(0, z) | 0 if z < 0; 1 if z > 0; undefined if z = 0 |
| Leaky ReLU | max(0.01z, z) | 0.01 if z < 0; 1 if z > 0; undefined if z = 0 |
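The four activations and their derivatives in the table can be sketched in a few lines of NumPy. This is a minimal illustration, not a library implementation; the function names and the `alpha=0.01` Leaky ReLU slope follow the table above, and at z = 0 (where the ReLU derivative is undefined) the code follows the common convention of returning the left-hand value.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def d_sigmoid(z):
    g = sigmoid(z)
    return g * (1.0 - g)          # g(z)(1 - g(z))

def tanh(z):
    return np.tanh(z)

def d_tanh(z):
    return 1.0 - np.tanh(z) ** 2  # 1 - tanh(z)^2

def relu(z):
    return np.maximum(0.0, z)

def d_relu(z):
    # Undefined at z == 0; by convention we return 0 there.
    return (np.asarray(z) > 0).astype(float)

def leaky_relu(z, alpha=0.01):
    return np.maximum(alpha * np.asarray(z), z)

def d_leaky_relu(z, alpha=0.01):
    # Undefined at z == 0; by convention we return alpha there.
    return np.where(np.asarray(z) > 0, 1.0, alpha)
```

For example, `sigmoid(0.0)` gives 0.5 and `leaky_relu(-1.0)` gives -0.01, matching the table.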
2022/7/8