Implementing the Simplest Feedforward Neural Network with numpy: Building the Backward Pass
In the previous article we built a rough forward-only neural network, but the large number of feedforward computations it needs makes training slow, so in this article we build the backward (backpropagation) network.
This article is mainly about the formulas.
Chain rule
Before studying how partial derivatives are propagated backward, we need a basic understanding of the chain rule.
$$
\text{Provided that } y = y(x) \text{ and } z = z(y), \\
\text{then } z'(x) = \frac{{\rm d}z}{{\rm d}x} = \frac{{\rm d}z}{{\rm d}y} \frac{{\rm d}y}{{\rm d}x} = z'(y) \cdot y'(x).
$$
$$
\text{Provided that } u = u(x,y),\ v = v(x,y),\ w = w(u,v), \\
\text{then } w'(x) = \frac{\partial w}{\partial u} \frac{\partial u}{\partial x} + \frac{\partial w}{\partial v} \frac{\partial v}{\partial x} = w'(u) \cdot u'(x) + w'(v) \cdot v'(x), \\
w'(y) = \frac{\partial w}{\partial u} \frac{\partial u}{\partial y} + \frac{\partial w}{\partial v} \frac{\partial v}{\partial y} = w'(u) \cdot u'(y) + w'(v) \cdot v'(y).
$$
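The two-variable form can be checked the same way. The functions u, v, w below are again made-up examples used only to illustrate the formula.

```python
import numpy as np

# Illustrative functions: u(x, y) = x*y, v(x, y) = x + 2*y, w(u, v) = u**2 + sin(v).
def w_of_xy(x, y):
    u, v = x * y, x + 2 * y
    return u ** 2 + np.sin(v)

x, y = 0.7, -1.2
u, v = x * y, x + 2 * y

# Chain rule: dw/dx = dw/du * du/dx + dw/dv * dv/dx (and likewise for y).
dw_dx = (2 * u) * y + np.cos(v) * 1
dw_dy = (2 * u) * x + np.cos(v) * 2

# Central finite differences for comparison.
eps = 1e-6
num_dx = (w_of_xy(x + eps, y) - w_of_xy(x - eps, y)) / (2 * eps)
num_dy = (w_of_xy(x, y + eps) - w_of_xy(x, y - eps)) / (2 * eps)

print(dw_dx, num_dx)   # should match closely
print(dw_dy, num_dy)
```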
For a feedforward neural network, the chain rule is applied like this:
$$
B_1 = F_1(A_1, A_2, A_3), \\
B_2 = F_2(A_1, A_2, A_3), \\
B_3 = F_3(A_1, A_2, A_3), \\
B_4 = F_4(A_1, A_2, A_3).
$$
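To show how this layer relation is used in practice, here is a minimal sketch, assuming one affine-plus-sigmoid layer with three inputs and four outputs, of the chain rule pushing a loss gradient from the outputs B back to the inputs A. The weight matrix W, the bias b, and the sigmoid activation are assumptions made for illustration, not the article's actual network.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Assumed example layer mapping 3 inputs A to 4 outputs B:  B = sigmoid(A @ W + b),
# i.e. B_j = F_j(A_1, A_2, A_3).
rng = np.random.default_rng(0)
A = rng.normal(size=(1, 3))        # inputs  A_1..A_3 (one sample)
W = rng.normal(size=(3, 4))        # weights (illustrative)
b = np.zeros((1, 4))               # biases  (illustrative)

Z = A @ W + b
B = sigmoid(Z)                     # outputs B_1..B_4

# Suppose the gradient of the loss with respect to B is already known (dummy value here).
dL_dB = np.ones_like(B)

# Chain rule applied layer-wise:
# dL/dA_i = sum_j dL/dB_j * dB_j/dA_i,
# with dB_j/dZ_j = B_j * (1 - B_j) for the sigmoid and dZ_j/dA_i = W[i, j].
dL_dZ = dL_dB * B * (1 - B)
dL_dA = dL_dZ @ W.T                # gradient pushed back to the inputs
dL_dW = A.T @ dL_dZ                # gradient with respect to the weights

print(dL_dA.shape, dL_dW.shape)    # (1, 3) (3, 4)
```

The same pattern, multiply by the local derivative and sum over the outputs each input feeds into, is what the backward pass repeats layer by layer.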