In this chapter my matrix-multiplication conventions match the book's, except that all of my vectors are column vectors.
Ch02 How the backpropagation algorithm works
Online book: http://neuralnetworksanddeeplearning.com/chap2.html
The four fundamental equations behind backpropagation
- Alternate presentation of the equations of backpropagation: I’ve stated the equations of backpropagation (notably (BP1) and (BP2)) using the Hadamard product. This presentation may be disconcerting if you’re unused to the Hadamard product. There’s an alternative approach, based on conventional matrix multiplication, which some readers may find enlightening. (1) Show that (BP1) may be rewritten as
$\delta^L = \Sigma'(z^L)\nabla_a C,$
where $\Sigma'(z^L)$ is a square matrix whose diagonal entries are the values $\sigma'(z_j^L)$, and whose off-diagonal entries are zero. Note that this matrix acts on $\nabla_a C$ by conventional matrix multiplication. (2) Show that (BP2) may be rewritten as
$\delta^l = \Sigma'(z^l)(w^{l+1})^T\delta^{l+1}$
(3) By combining observations (1) and (2) show that
$\delta^l = \Sigma'(z^l)(w^{l+1})^T \ldots \Sigma'(z^{L-1})(w^{L})^T\Sigma'(z^L)\nabla_a C$
For readers comfortable with matrix multiplication this equation may be easier to understand than (BP1) and (BP2). The reason I’ve focused on (BP1) and (BP2) is because that approach turns out to be faster to implement numerically.
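As a quick numerical sanity check (a sketch of my own, not code from the book, using hypothetical values for a 3-neuron output layer), NumPy confirms that the Hadamard form and the diagonal-matrix form of (BP1) agree:

```python
import numpy as np

rng = np.random.default_rng(0)
z_L = rng.standard_normal(3)      # weighted inputs z^L (hypothetical)
grad_a = rng.standard_normal(3)   # gradient of C w.r.t. activations, ∇_a C

def sigma_prime(z):
    """Derivative of the sigmoid activation."""
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)

# (BP1) with the Hadamard product: δ^L = ∇_a C ⊙ σ'(z^L)
delta_hadamard = grad_a * sigma_prime(z_L)

# (BP1) with a diagonal matrix: δ^L = Σ'(z^L) ∇_a C
Sigma_prime = np.diag(sigma_prime(z_L))
delta_matrix = Sigma_prime @ grad_a

print(np.allclose(delta_hadamard, delta_matrix))  # True
```

The Hadamard form is what you would actually implement: it avoids building (and multiplying by) a mostly-zero diagonal matrix, which is why the book says it is faster numerically.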
Proof of (BP1): by (BP1), the components of $\delta^L$ are $\delta_j^L = \frac{\partial C}{\partial a_j^L}\sigma'(z_j^L)$, that is,
$\delta^L = \left(\frac{\partial C}{\partial a_1^L}\sigma'(z_1^L), \frac{\partial C}{\partial a_2^L}\sigma'(z_2^L), \ldots, \frac{\partial C}{\partial a_k^L}\sigma'(z_k^L)\right)^T,$
which is exactly the diagonal matrix $\Sigma'(z^L)$ applied to $\nabla_a C$.
Proof of (BP2):
$\delta^l = \left(\left(w^{l+1}\right)^T\delta^{l+1}\right)\odot \sigma'(z^l)$
$=\left[\begin{array}{cccc}
w_{11}^{l+1} & w_{12}^{l+1} & \cdots & w_{1k}^{l+1} \\
w_{21}^{l+1} & w_{22}^{l+1} & \cdots & w_{2k}^{l+1} \\
\vdots & \vdots & \ddots & \vdots \\
w_{j1}^{l+1} & w_{j2}^{l+1} & \cdots & w_{jk}^{l+1}
\end{array}\right]^T
\cdot \left[\begin{array}{c}
\delta_1^{l+1} \\
\delta_2^{l+1} \\
\vdots \\
\delta_j^{l+1}
\end{array}\right] \odot \left[\begin{array}{c}
\sigma'(z_1^l) \\
\sigma'(z_2^l) \\
\vdots \\
\sigma'(z_k^l)
\end{array}\right]$
$=\left[\begin{array}{c}
\sum_j \left(w_{j1}^{l+1}\delta_j^{l+1}\right)\cdot \sigma'(z_1^l) \\
\sum_j \left(w_{j2}^{l+1}\delta_j^{l+1}\right)\cdot \sigma'(z_2^l) \\
\vdots \\
\sum_j \left(w_{jk}^{l+1}\delta_j^{l+1}\right)\cdot \sigma'(z_k^l)
\end{array}\right]$
$\Sigma'(z^l)(w^{l+1})^T\delta^{l+1}$
$=\left[\begin{array}{cccc}
\sigma'(z_1^l) & 0 & \cdots & 0 \\
0 & \sigma'(z_2^l) & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \sigma'(z_k^l)
\end{array}\right] \cdot \left[\begin{array}{cccc}
w_{11}^{l+1} & w_{12}^{l+1} & \cdots & w_{1k}^{l+1} \\
w_{21}^{l+1} & w_{22}^{l+1} & \cdots & w_{2k}^{l+1} \\
\vdots & \vdots & \ddots & \vdots \\
w_{j1}^{l+1} & w_{j2}^{l+1} & \cdots & w_{jk}^{l+1}
\end{array}\right]^T \cdot \left[\begin{array}{c}
\delta_1^{l+1} \\
\delta_2^{l+1} \\
\vdots \\
\delta_j^{l+1}
\end{array}\right]$
$=\left[\begin{array}{c}
\sigma'(z_1^l) \cdot \sum_j \left(w_{j1}^{l+1} \delta_j^{l+1}\right) \\
\sigma'(z_2^l) \cdot \sum_j \left(w_{j2}^{l+1} \delta_j^{l+1}\right) \\
\vdots \\
\sigma'(z_k^l) \cdot \sum_j \left(w_{jk}^{l+1} \delta_j^{l+1}\right)
\end{array}\right] = \delta^l$
Here layer $l$ has $k$ neurons and layer $l+1$ has $j$ neurons, and $w_{jk}^{l+1}$ denotes the weight connecting the $k$-th neuron in layer $l$ to the $j$-th neuron in layer $l+1$.
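The same equivalence for (BP2) can be checked numerically (a sketch with hypothetical sizes $k=4$, $j=3$, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
k, j = 4, 3                              # layer l has k neurons, layer l+1 has j
w_next = rng.standard_normal((j, k))     # w^{l+1}, shape (j, k)
delta_next = rng.standard_normal(j)      # δ^{l+1}
z_l = rng.standard_normal(k)             # z^l

def sigma_prime(z):
    """Derivative of the sigmoid activation."""
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)

# Hadamard form of (BP2): ((w^{l+1})^T δ^{l+1}) ⊙ σ'(z^l)
lhs = (w_next.T @ delta_next) * sigma_prime(z_l)

# Diagonal-matrix form: Σ'(z^l) (w^{l+1})^T δ^{l+1}
rhs = np.diag(sigma_prime(z_l)) @ w_next.T @ delta_next

print(np.allclose(lhs, rhs))  # True
```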
$\delta^l = \Sigma'(z^l)(w^{l+1})^T \ldots \Sigma'(z^{L-1})(w^{L})^T\Sigma'(z^L)\nabla_a C$
This follows by repeatedly substituting $\delta^{l+1}=\Sigma'(z^{l+1})(w^{l+2})^T\delta^{l+2}$, and so on, into the matrix form of (BP2).
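The fully expanded product can also be verified against the layer-by-layer recursion. Below is a sketch on a hypothetical 4-layer net (the layer sizes and weights are made up for illustration): the backward recursion of (BP1)/(BP2) and the single chained matrix product give the same $\delta^l$.

```python
import numpy as np

def sigma_prime(z):
    """Derivative of the sigmoid activation."""
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)

rng = np.random.default_rng(2)
sizes = [5, 4, 3, 2]                       # hypothetical layer sizes, l ... L
# ws[i] maps layer i to layer i+1 (so ws[-1] is w^L)
ws = [rng.standard_normal((sizes[i + 1], sizes[i])) for i in range(len(sizes) - 1)]
zs = [rng.standard_normal(n) for n in sizes]   # weighted inputs z^l ... z^L
grad_a = rng.standard_normal(sizes[-1])        # ∇_a C at the output layer

# Backward recursion: (BP1) once, then (BP2) repeatedly
delta = sigma_prime(zs[-1]) * grad_a
for w, z in zip(reversed(ws), reversed(zs[:-1])):
    delta = sigma_prime(z) * (w.T @ delta)

# Chained product: δ^l = Σ'(z^l)(w^{l+1})^T ... Σ'(z^{L-1})(w^L)^T Σ'(z^L) ∇_a C
v = np.diag(sigma_prime(zs[-1])) @ grad_a
for i in range(len(ws) - 1, -1, -1):
    v = np.diag(sigma_prime(zs[i])) @ ws[i].T @ v

print(np.allclose(delta, v))  # True
```

Both loops traverse the same factors right to left; the only difference is whether $\Sigma'$ is applied as a Hadamard product or as an explicit diagonal matrix.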
Proof of the four fundamental equations (optional)
- Prove Equations (BP3) and (BP4).
Proof of (BP3): by the chain rule,
$\frac{\partial C}{\partial b_j^l} = \frac{\partial C}{\partial a_j^l}\frac{\partial a_j^l}{\partial z_j^l}\frac{\partial z_j^l}{\partial b_j^l}$
Since $\frac{\partial C}{\partial a_j^l}\frac{\partial a_j^l}{\partial z_j^l}=\frac{\partial C}{\partial a_j^l}\sigma'(z_j^l) = \delta_j^l$ and $\frac{\partial z_j^l}{\partial b_j^l}=1$, it follows that $\frac{\partial C}{\partial b_j^l}=\delta_j^l$.
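The relation $\partial C/\partial b_j^l = \delta_j^l$ of (BP3) can be sanity-checked with finite differences. This is a sketch on a hypothetical two-weight-layer sigmoid net with quadratic cost (all sizes and values made up for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sp(z):
    """Derivative of the sigmoid activation."""
    return sigmoid(z) * (1.0 - sigmoid(z))

rng = np.random.default_rng(3)
w2, b2 = rng.standard_normal((3, 4)), rng.standard_normal(3)
w3, b3 = rng.standard_normal((2, 3)), rng.standard_normal(2)
x = rng.standard_normal(4)   # input
y = rng.standard_normal(2)   # target

def cost(b2):
    """Quadratic cost C = ½‖a^3 − y‖² as a function of the biases b^2."""
    a2 = sigmoid(w2 @ x + b2)
    a3 = sigmoid(w3 @ a2 + b3)
    return 0.5 * np.sum((a3 - y) ** 2)

# Backpropagated δ^2 via (BP1) and (BP2)
z2 = w2 @ x + b2
z3 = w3 @ sigmoid(z2) + b3
delta3 = (sigmoid(z3) - y) * sp(z3)
delta2 = (w3.T @ delta3) * sp(z2)

# Central-difference estimate of ∂C/∂b_j^2
eps = 1e-6
num_grad = np.array([
    (cost(b2 + eps * e) - cost(b2 - eps * e)) / (2 * eps)
    for e in np.eye(3)
])

print(np.allclose(num_grad, delta2, atol=1e-6))  # True
```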