Activation Functions in Neural Networks

Logistic / Sigmoid function

$$g(x) = \frac{1}{1+e^{-x}} = \frac{e^{x}}{1+e^{x}}$$

$$g(-x) = \frac{1}{1+e^{x}} = \frac{e^{-x}}{1+e^{-x}}$$

$$g(x) + g(-x) = 1$$

$$g(0) = \frac{1}{2}$$

$$\lim_{x \to +\infty} g(x) = 1, \qquad \lim_{x \to -\infty} g(x) = 0$$
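These identities are easy to confirm numerically. Below is a minimal NumPy sketch; the helper name `sigmoid` is just a convenience for this post, not a library function.

```python
import numpy as np

def sigmoid(x):
    # g(x) = 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-10.0, 10.0, 7)

# Symmetry: g(x) + g(-x) == 1 for every x
assert np.allclose(sigmoid(x) + sigmoid(-x), 1.0)

# Midpoint: g(0) == 1/2
assert np.isclose(sigmoid(0.0), 0.5)

# Saturation toward the limits 1 and 0
print(sigmoid(50.0), sigmoid(-50.0))   # ~1.0, ~0.0
```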
$$
\begin{aligned}
g'(x) &= -\frac{1}{(1+e^{-x})^2}\cdot e^{-x}\cdot(-1) = \frac{e^{-x}}{(1+e^{-x})^2} \\
      &= \frac{1}{1+e^{-x}}\cdot\frac{e^{-x}}{1+e^{-x}} = g(x)\,g(-x) = g(x)\bigl[1-g(x)\bigr]
\end{aligned}
$$

$$g'(x) > 0, \quad x \in \mathbb{R}$$

$$
\begin{aligned}
g''(x) &= \bigl\{g(x)[1-g(x)]\bigr\}' \\
       &= g'(x)[1-g(x)] + g(x)[1-g(x)]' \\
       &= g'(x)[1-g(x)] - g(x)\,g'(x) \\
       &= g'(x)[1-2g(x)] \\
       &= g(x)[1-g(x)][1-2g(x)]
\end{aligned}
$$

$$
g''(x)
\begin{cases}
< 0, & x > 0, \\
= 0, & x = 0, \\
> 0, & x < 0.
\end{cases}
$$
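Because $g'(x) = g(x)[1-g(x)]$, the gradient of a sigmoid unit can be computed from the forward activation alone, which is the usual trick in backpropagation code. A hedged sketch, checking the closed form against a central finite difference:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # g'(x) = g(x) * (1 - g(x)): only the forward value is needed
    g = sigmoid(x)
    return g * (1.0 - g)

x = np.linspace(-5.0, 5.0, 11)
h = 1e-5
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2.0 * h)   # central difference
assert np.allclose(sigmoid_grad(x), numeric, atol=1e-8)

# The slope peaks at g'(0) = 1/4 and decays for large |x|,
# which is the saturation behind vanishing sigmoid gradients.
print(sigmoid_grad(0.0))   # 0.25
```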

tanh function

$$\tanh(x) = 2g(2x) - 1 = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}$$

$$
\begin{aligned}
\tanh'(x) &= 2\,g'(2x)\cdot 2 = 4\,g(2x)\bigl[1-g(2x)\bigr] = \bigl[1+\tanh(x)\bigr]\bigl[1-\tanh(x)\bigr] \\
          &= 1 - \tanh^2(x)
\end{aligned}
$$
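The relation $\tanh(x) = 2g(2x) - 1$ and the derivative $1 - \tanh^2(x)$ can be checked the same way; another short sketch, reusing the `sigmoid` helper defined above:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-4.0, 4.0, 9)

# tanh as a rescaled, zero-centered sigmoid: tanh(x) = 2*g(2x) - 1
assert np.allclose(np.tanh(x), 2.0 * sigmoid(2.0 * x) - 1.0)

# tanh'(x) = 1 - tanh(x)^2, checked against a central difference
h = 1e-5
numeric = (np.tanh(x + h) - np.tanh(x - h)) / (2.0 * h)
assert np.allclose(1.0 - np.tanh(x) ** 2, numeric, atol=1e-8)
```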
