Deep learning relies on nonlinear transformations, but the mapping $y = Wx + b$ by itself is still linear; the network only becomes nonlinear once an activation function is added.
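To see why a stack of purely affine layers gains nothing, note that composing $y = W_2(W_1 x + b_1) + b_2$ collapses into a single affine map. A minimal NumPy sketch of this check (the shapes and random weights are arbitrary, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)

# Two stacked affine layers without an activation ...
deep = W2 @ (W1 @ x + b1) + b2
# ... equal one affine layer with W = W2 W1 and b = W2 b1 + b2
flat = (W2 @ W1) @ x + (W2 @ b1 + b2)
print(np.allclose(deep, flat))  # True: stacking adds no expressiveness

# Inserting a nonlinearity such as ReLU breaks the collapse
relu_deep = W2 @ np.maximum(0, W1 @ x + b1) + b2
print(np.allclose(relu_deep, flat))  # almost surely False
```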
- Logistic function (Sigmoid)
$$\phi(x)=\frac{1}{1+e^{-x}}$$
- Hyperbolic tangent (Tanh)
$$\phi(x)=\tanh(x)=\frac{1-e^{-2x}}{1+e^{-2x}}$$
- ReLU
$$\phi(x)=\max(0,x)$$
- ELU
$$\phi(\alpha,x)=\begin{cases}\alpha(e^x-1) & \text{if } x<0\\ x & \text{otherwise}\end{cases}$$
- Softplus
$$\phi(x)=\ln(1+e^x)$$
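A side note on implementation: evaluated literally, some of these formulas overflow in floating point, e.g. $e^x$ in softplus for large $x$, or $e^{-x}$ in sigmoid for very negative $x$. A small sketch of numerically stable variants, assuming NumPy and SciPy are available:

```python
import numpy as np
from scipy.special import expit  # numerically stable sigmoid

x = np.array([-1000.0, 0.0, 1000.0])

# Naive softplus np.log(1 + np.exp(x)) overflows at x = 1000;
# np.logaddexp(0, x) computes log(e^0 + e^x) stably.
print(np.logaddexp(0, x))  # [0.        0.6931...  1000.     ]

# Naive sigmoid 1 / (1 + np.exp(-x)) overflows at x = -1000.
print(expit(x))            # [0.   0.5  1.  ]
```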
The code to plot these functions is as follows:
```python
import matplotlib.pyplot as plt
import numpy as np
from math import exp


def plot_sigmoid():
    # Sigmoid: phi(x) = 1 / (1 + e^-x), squashes input into (0, 1)
    x = np.linspace(-10, 10, 200)
    y = 1 / (1 + np.exp(-x))
    plt.plot(x, y)
    plt.title('sigmoid')
    plt.show()


def plot_tanh():
    # Tanh: phi(x) = (1 - e^-2x) / (1 + e^-2x), squashes input into (-1, 1)
    x = np.linspace(-10, 10, 200)
    y = (1 - np.exp(-2 * x)) / (1 + np.exp(-2 * x))
    plt.plot(x, y)
    plt.title('tanh')
    plt.show()


def plot_relu():
    # ReLU: phi(x) = max(0, x)
    x = np.linspace(-10, 10, 200)
    y = [xx if xx > 0 else 0 for xx in x]
    plt.plot(x, y)
    plt.title('relu')
    plt.show()


def plot_elu(alpha=1):
    # ELU: alpha * (e^x - 1) for x < 0, x otherwise
    x = np.linspace(-10, 10, 200)
    y = [alpha * (exp(xx) - 1) if xx < 0 else xx for xx in x]
    plt.plot(x, y)
    plt.title('elu')
    plt.show()


def plot_softplus():
    # Softplus: phi(x) = ln(1 + e^x), a smooth approximation of ReLU
    x = np.linspace(-10, 10, 200)
    y = np.log(1 + np.exp(x))
    plt.plot(x, y)
    plt.title('softplus')
    plt.show()


if __name__ == "__main__":
    plot_sigmoid()
    plot_tanh()
    plot_relu()
    plot_elu()
    plot_softplus()
```
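Since each function above opens its own window, a possible variant for comparing all five curves side by side in one figure, vectorized with `np.where`/`np.maximum` instead of list comprehensions (a sketch, not part of the original code):

```python
import matplotlib.pyplot as plt
import numpy as np

x = np.linspace(-10, 10, 200)
curves = {
    'sigmoid': 1 / (1 + np.exp(-x)),
    'tanh': np.tanh(x),
    'relu': np.maximum(0, x),
    'elu': np.where(x < 0, np.exp(x) - 1, x),  # alpha = 1
    'softplus': np.log(1 + np.exp(x)),
}

fig, axes = plt.subplots(1, 5, figsize=(15, 3))
for ax, (name, y) in zip(axes, curves.items()):
    ax.plot(x, y)
    ax.set_title(name)
plt.tight_layout()
plt.show()
```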