LayerNorm Backward Gradient

\mu =\frac{1}{H}\sum_{i=1}^{H}x_{i}

\sigma ^{2}=\frac{1}{H}\sum_{i=1}^{H}\left ( x_{i}-\mu \right )^{2}

\hat{x}=\frac{x-\mu }{\sqrt{\sigma ^{2}+\varepsilon }}

y=g\odot \hat{x}+b     where \odot denotes element-wise multiplication
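
To make the forward definitions concrete, here is a minimal NumPy sketch (the function name, argument names, and the eps default are my own choices, not from any particular framework):

```python
import numpy as np

def layernorm_forward(x, g, b, eps=1e-5):
    # mu = (1/H) * sum_i x_i, over the normalized (last) dimension
    mu = x.mean(axis=-1, keepdims=True)
    # sigma^2 = (1/H) * sum_i (x_i - mu)^2  (biased variance, as in the formulas)
    var = ((x - mu) ** 2).mean(axis=-1, keepdims=True)
    # x_hat = (x - mu) / sqrt(sigma^2 + eps)
    x_hat = (x - mu) / np.sqrt(var + eps)
    # y = g ⊙ x_hat + b, element-wise scale and shift
    return g * x_hat + b
```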

The gradient with respect to the input is then (note that \mu and \sigma ^{2} both depend on every x_{i}, which is why \partial \hat{x}_{j}/\partial x_{i} is nonzero even for j \neq i):

\begin{align*} \frac{\partial L}{\partial x_{i}} &=\sum_{j=1}^{H}\frac{\partial L}{\partial y_{j}}\cdot \frac{\partial y_{j}}{\partial \hat{x}_{j}}\cdot \frac{\partial \hat{x}_{j}}{\partial x_{i}} \\ &= \sum_{j=1}^{H}\frac{\partial L}{\partial y_{j}}\cdot g_{j}\cdot \frac{\partial \hat{x}_{j}}{\partial x_{i}} \end{align*}

\begin{align*}\frac{\partial \hat{x}_{j}}{\partial x_{i}} &= \frac{\partial }{\partial x_{i}}\left ( \frac{x_{j}-\mu }{\sqrt{\sigma ^{2}+\varepsilon }} \right )\\ &=\frac{1}{\sqrt{\sigma ^{2}+\varepsilon }}\frac{\partial x_{j}}{\partial x_{i}}-\frac{1}{\sqrt{\sigma ^{2}+\varepsilon }}\frac{\partial \mu }{\partial x_{i}}\\ &+\left ( x_{j}-\mu \right )\cdot \left ( -\frac{1}{2} \right )\cdot \left ( \sigma ^{2}+\varepsilon \right )^{-\frac{3}{2}}\frac{\partial \sigma ^{2}}{\partial x_{i}} \end{align*}

When j = i,

\frac{\partial x_{i}}{\partial x_{i}} =1

When j ≠ i,

\frac{\partial x_{j}}{\partial x_{i}} = 0

\frac{\partial \mu }{\partial x_{i}} = \frac{\partial}{\partial x_{i}}\left ( \frac{1}{H}\sum_{k=1}^{H}x_{k} \right )

Although this is written as a sum, only the k = i term has a nonzero derivative, so:

\frac{\partial \mu }{\partial x_{i}} = \frac{1}{H}
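
As a quick sanity check of this step, here is a throwaway finite-difference computation (assuming NumPy; the values H = 8 and i = 3 are arbitrary):

```python
import numpy as np

H, i, h = 8, 3, 1e-6
x = np.random.randn(H)

x2 = x.copy()
x2[i] += h  # perturb only x_i

# Finite difference of mu with respect to x_i; should print ~1/H = 0.125
print((x2.mean() - x.mean()) / h)
```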

Similarly, the direct dependence on x_{i} enters only through the k = i term, but every term also depends on x_{i} through \mu:

\begin{align*}\frac{\partial \sigma ^{2}}{\partial x_{i}} &= \frac{\partial}{\partial x_{i}}\left [ \frac{1}{H}\sum_{k=1}^{H}\left ( x_{k}-\mu \right )^{2} \right ]\\ &=\frac{2}{H}\left ( x_{i}-\mu \right )+\frac{1}{H}\sum_{k=1}^{H}2\cdot \left ( x_{k}-\mu \right )\left (-\frac{\partial \mu }{\partial x_{i}} \right ) \\ &= \frac{2}{H}\left ( x_{i}-\mu \right )-\frac{1}{H}\sum_{k=1}^{H}2\cdot \left ( x_{k}-\mu \right ) \frac{\partial \mu }{\partial x_{i}} \end{align*}
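
As an aside, the second term above is in fact zero, because \sum_{k=1}^{H}\left ( x_{k}-\mu \right )=0 follows directly from the definition of \mu. The partial derivative could therefore be written more compactly as

\frac{\partial \sigma ^{2}}{\partial x_{i}} = \frac{2}{H}\left ( x_{i}-\mu \right )

The assembly below keeps the unsimplified form; the corresponding term drops out of the final result for the same reason.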

Therefore:

\begin{align*} \frac{\partial L}{\partial x_{i}} &= \sum_{j=1}^{H}\frac{\partial L}{\partial y_{j}}\cdot g_{j}\cdot \frac{1}{\sqrt{\sigma ^{2}+\varepsilon }}\cdot \frac{\partial x_{j}}{\partial x_{i}}\\ &-\sum_{j=1}^{H}\frac{\partial L}{\partial y_{j}}\cdot g_{j}\cdot \frac{1}{\sqrt{\sigma ^{2}+\varepsilon }}\cdot \frac{1}{H} \\ &+\sum_{j=1}^{H}\frac{\partial L}{\partial y_{j}}\cdot g_{j}\cdot\left ( x_{j}-\mu \right )\cdot \left ( -\frac{1}{2} \right ) \left (\sigma ^{2}+\varepsilon \right ) ^{-\frac{3}{2}}\cdot\frac{2}{H}\left ( x_{i}-\mu \right )\\&-\frac{1}{H}\sum_{k=1}^{H}\left [\sum_{j=1}^{H}\frac{\partial L}{\partial y_{j}}\cdot g_{j}\cdot\left ( x_{j}-\mu \right )\cdot \left ( -\frac{1}{2} \right ) \left (\sigma ^{2}+\varepsilon \right ) ^{-\frac{3}{2}} \right ]\frac{2}{H}\left ( x_{k}-\mu \right ) \end{align*}

\begin{align*} \frac{\partial L}{\partial x_{i}} &= \left ( \sigma ^{2}+\varepsilon \right )^{-\frac{1}{2}}\frac{\partial L}{\partial y_{i}}\cdot g_{i}\\ &-\left ( \sigma ^{2}+\varepsilon \right )^{-\frac{1}{2}}\cdot \frac{1}{H} \cdot \sum_{j=1}^{H}\frac{\partial L}{\partial y_{j}}\cdot g_{j} \\ &-\left ( \sigma ^{2}+\varepsilon \right )^{-\frac{1}{2}}\cdot \frac{1}{H}\cdot \left ( x_{i}-\mu \right )\cdot \sum_{j=1}^{H}\frac{\partial L}{\partial y_{j}}\cdot g_{j}\cdot\frac{ x_{j}-\mu}{\sigma ^{2}+\varepsilon} \\&+\left ( \sigma ^{2}+\varepsilon \right )^{-\frac{1}{2}}\cdot\frac{1}{H}\cdot \sum_{k=1}^{H}\frac{1}{H}\cdot \left ( x_{k}-\mu \right )\cdot \left [\sum_{j=1}^{H}\frac{\partial L}{\partial y_{j}}\cdot g_{j}\cdot \frac{x_{j}-\mu}{\sigma ^{2}+\varepsilon} \right ] \end{align*}
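
By the same argument (\sum_{k=1}^{H}\left ( x_{k}-\mu \right )=0), the last term vanishes. Writing \hat{x}_{i}=\left ( x_{i}-\mu \right )/\sqrt{\sigma ^{2}+\varepsilon }, the gradient collapses to the familiar three-term form:

\frac{\partial L}{\partial x_{i}} = \frac{1}{\sqrt{\sigma ^{2}+\varepsilon }}\left [ \frac{\partial L}{\partial y_{i}}\cdot g_{i}-\frac{1}{H}\sum_{j=1}^{H}\frac{\partial L}{\partial y_{j}}\cdot g_{j}-\frac{\hat{x}_{i}}{H}\sum_{j=1}^{H}\frac{\partial L}{\partial y_{j}}\cdot g_{j}\cdot \hat{x}_{j} \right ]

Here is a minimal NumPy sketch of this backward formula, checked against a finite-difference gradient (all names are my own choices, and the loss L = \sum_{i} y_{i} is a hypothetical one used only for the check):

```python
import numpy as np

EPS = 1e-5

def layernorm_backward(dy, x, g, eps=EPS):
    # Implements dL/dx_i = (1/sqrt(var+eps)) * (dxhat_i - mean_j(dxhat_j)
    #                       - x_hat_i * mean_j(dxhat_j * x_hat_j)),
    # where dxhat_j = dL/dy_j * g_j.
    mu = x.mean(axis=-1, keepdims=True)
    var = ((x - mu) ** 2).mean(axis=-1, keepdims=True)
    inv_std = 1.0 / np.sqrt(var + eps)
    x_hat = (x - mu) * inv_std
    dxhat = dy * g
    return inv_std * (dxhat
                      - dxhat.mean(axis=-1, keepdims=True)
                      - x_hat * (dxhat * x_hat).mean(axis=-1, keepdims=True))

# Finite-difference check with the hypothetical loss L = sum(y).
rng = np.random.default_rng(0)
H = 6
x, g, b = rng.standard_normal(H), rng.standard_normal(H), rng.standard_normal(H)

def loss(x):
    mu = x.mean()
    var = ((x - mu) ** 2).mean()
    return (g * (x - mu) / np.sqrt(var + EPS) + b).sum()

dy = np.ones(H)  # dL/dy_j = 1 for L = sum(y)
analytic = layernorm_backward(dy, x, g)

h = 1e-6
numeric = np.zeros(H)
for i in range(H):
    e = np.zeros(H)
    e[i] = h
    numeric[i] = (loss(x + e) - loss(x - e)) / (2 * h)

print(np.max(np.abs(analytic - numeric)))  # tiny (~1e-9): formulas agree
```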
