Assignment 1: Deriving the Softmax Gradient under Cross-Entropy Loss
Created: March 18, 2022 1:19 PM
Given $L = -\log \dfrac{e^{s_k}}{\sum_j e^{s_j}}$, find $\dfrac{\partial L}{\partial s_i}$ (the derivative with respect to the logit $s_i$; the steps below differentiate through $s_i$, not through $e^{s_i}$).
Define $p_i = \dfrac{e^{s_i}}{\sum_j e^{s_j}}$, so that $L = -\log p_k$.
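As a quick sanity check of the definition (not part of the assignment), here is a minimal NumPy sketch of $p_i$; the max-subtraction is a standard stability trick and cancels out of $p_i$, so the values are unchanged.

```python
import numpy as np

def softmax(s):
    # Shift by the max before exponentiating for numerical stability;
    # the shift cancels between numerator and denominator, so p_i is unchanged.
    e = np.exp(s - s.max())
    return e / e.sum()

s = np.array([2.0, 1.0, 0.1])  # example logits (arbitrary values)
p = softmax(s)
print(p)  # nonnegative probabilities that sum to 1
```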
- Case $i = k$:
$$
\begin{aligned}
\frac{\partial L}{\partial s_i}
&= -\frac{1}{p_k}\frac{\partial p_k}{\partial s_k} \\
&= -\frac{1}{p_k}\cdot\frac{e^{s_k}\cdot\sum_j e^{s_j} - e^{s_k}\cdot e^{s_k}}{\left(\sum_j e^{s_j}\right)^2} \\
&= -\frac{1}{p_k}\cdot\frac{e^{s_k}}{\sum_j e^{s_j}}\cdot\frac{\sum_{j \neq k} e^{s_j}}{\sum_j e^{s_j}} \\
&= -\frac{1}{p_k}\cdot\frac{e^{s_k}}{\sum_j e^{s_j}}\left(1 - \frac{e^{s_k}}{\sum_j e^{s_j}}\right) \\
&= -\frac{1}{p_k}\, p_k (1 - p_k) \\
&= p_k - 1
\end{aligned}
$$
- Case $i \neq k$:
$$
\begin{aligned}
\frac{\partial L}{\partial s_i}
&= -\frac{1}{p_k}\frac{\partial p_k}{\partial s_i} \\
&= -\frac{1}{p_k}\cdot\frac{-e^{s_k} e^{s_i}}{\left(\sum_j e^{s_j}\right)^2} \\
&= \frac{1}{p_k}\cdot\frac{e^{s_k}}{\sum_j e^{s_j}}\cdot\frac{e^{s_i}}{\sum_j e^{s_j}} \\
&= \frac{1}{p_k}\cdot p_k \cdot p_i \\
&= p_i
\end{aligned}
$$

(Since $L = -\log p_k$ depends on $s_i$ only through $p_k$, the chain rule uses $\frac{1}{p_k}\frac{\partial p_k}{\partial s_i}$, not $\frac{1}{p_i}\frac{\partial p_i}{\partial s_i}$.)
Summary
$$
\frac{\partial L}{\partial s_i} =
\begin{cases}
p_k - 1, & i = k \\
p_i, & i \neq k
\end{cases}
$$
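The closed form above can be verified numerically: the gradient of $L$ with respect to the logits is the softmax output minus a one-hot vector at $k$. Below is a small sketch comparing it against central finite differences (the logit values and $\varepsilon$ are arbitrary choices for the check).

```python
import numpy as np

def softmax(s):
    e = np.exp(s - s.max())  # stable softmax
    return e / e.sum()

def loss(s, k):
    # Cross-entropy loss for true class k: L = -log p_k
    return -np.log(softmax(s)[k])

s = np.array([2.0, 1.0, 0.1])  # example logits
k = 0                           # true class index

# Analytic gradient from the derivation: p_i - 1 if i == k, else p_i,
# i.e. softmax(s) minus the one-hot vector for class k.
grad = softmax(s).copy()
grad[k] -= 1.0

# Central finite-difference approximation of the same gradient.
eps = 1e-6
I = np.eye(len(s))
num = np.array([(loss(s + eps * I[i], k) - loss(s - eps * I[i], k)) / (2 * eps)
                for i in range(len(s))])

print(np.abs(grad - num).max())  # should be tiny (on the order of 1e-9 or less)
```

The check confirms the two cases collapse into the single vector form $\nabla_s L = p - \mathbf{1}_k$, which is why the backward pass of softmax-plus-cross-entropy is so cheap in practice.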