2. Gradient Clipping for Exploding Gradients

  1. Exploding Gradients (a possible cause of NaN losses):
    When the parameters approach a cliff region of the loss surface, a single gradient update step can move the learner to a very bad configuration, causing the loss to diverge.

  2. Gradient Clipping: constrain the gradient's magnitude to a fixed range.
      To address the presence of cliffs, a useful heuristic is to clip the magnitude of the gradient: keep its direction, but rescale it whenever its norm exceeds a threshold (a hyperparameter).
      For example, suppose we constrain the gradient norm to the range [0, 20]:
       - if $\|g_t\| > 20$, rescale the gradient to $\|g_t\| = 20$ by dividing it by the scalar $\|g_t\| / 20$;
       - if $\|g_t\| \le 20$, use the gradient as-is.
      Equivalently, $g_t \leftarrow g_t \cdot \min\left(1, \frac{20}{\|g_t\|}\right)$, as shown in the sketch below.
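A minimal sketch of this norm-based clipping, assuming the gradient is a NumPy vector; the threshold of 20 is just the example hyperparameter from above:

```python
import numpy as np

def clip_by_norm(grad: np.ndarray, max_norm: float = 20.0) -> np.ndarray:
    """Keep the gradient's direction but cap its L2 norm at max_norm."""
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        # Dividing by the scalar norm / max_norm rescales the gradient
        # so that its clipped norm equals exactly max_norm.
        grad = grad * (max_norm / norm)
    return grad

# Usage: g = clip_by_norm(g) before applying the parameter update.
```

In PyTorch the same heuristic is available as `torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=20.0)`, called after `loss.backward()` and before `optimizer.step()`.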

(Figure: the bold line shows the update without clipping, which causes divergence; the dashed line shows the update with clipping.)