Batch Normalization -- Mitigating the Vanishing Gradient Problem
An introductory guide to Batch Normalization:
https://blog.csdn.net/malefactor/article/details/51476961
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift (the original Google paper):
https://arxiv.org/pdf/1502.03167.pdf
Reading notes on and an implementation of "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift":
https://blog.csdn.net/happynear/article/details/44238541
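To make the normalization step concrete, here is a minimal NumPy sketch of the batch-normalizing transform from the paper (Algorithm 1), training-time forward pass only. The function name, the `eps` default, and the example shapes are illustrative choices; the running-statistics bookkeeping the paper uses at inference time is omitted.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Training-time batch norm over a (batch, features) input.

    Each feature is normalized to zero mean / unit variance across the
    mini-batch, then scaled and shifted by the learnable parameters
    gamma and beta, as in Algorithm 1 of the paper.
    """
    mu = x.mean(axis=0)                    # per-feature mini-batch mean
    var = x.var(axis=0)                    # per-feature mini-batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalize
    return gamma * x_hat + beta            # scale and shift

# Example: a mini-batch of 4 samples with 3 poorly scaled features
x = np.random.randn(4, 3) * 10 + 5
y = batch_norm_forward(x, gamma=np.ones(3), beta=np.zeros(3))
print(y.mean(axis=0), y.var(axis=0))       # ~0 mean, ~1 variance per feature
```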
How to address vanishing and exploding gradients:
1. For RNNs, apply gradient clipping to avoid exploding gradients (see the clipping sketch after this list).
2. Add a regularization (weight-penalty) term, which discourages the large weight values that drive exploding gradients.
3. Use an architecture with self-loops and gating mechanisms, such as the LSTM, to avoid vanishing gradients (a minimal gated cell step is sketched below); see: https://www.cnblogs.com/pinking/p/9362966.html
4. Improve the activation function, e.g., replace sigmoid with ReLU, to avoid vanishing gradients (see the numeric comparison below).
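For item 1, a common form is clipping by global norm: if the joint L2 norm of all gradients exceeds a threshold, rescale them proportionally. A minimal NumPy sketch; the function name and the threshold value are illustrative assumptions:

```python
import numpy as np

def clip_by_global_norm(grads, max_norm=5.0):
    """Rescale a list of gradient arrays so their joint L2 norm does
    not exceed max_norm -- a standard remedy for exploding gradients
    in RNN training."""
    total_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    if total_norm > max_norm:
        scale = max_norm / total_norm      # shrink all gradients uniformly
        grads = [g * scale for g in grads]
    return grads
```

For item 3, the mechanism that helps is the LSTM's additive cell-state update, gated by sigmoid units. A minimal NumPy sketch of one time step; gate ordering, weight shapes, and initialization are simplified assumptions, not a full implementation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step. W has shape (len(x) + H, 4 * H) and maps the
    concatenated [x, h_prev] to the four gate pre-activations."""
    H = h_prev.size
    z = np.concatenate([x, h_prev]) @ W + b
    f = sigmoid(z[0*H:1*H])   # forget gate: how much old cell state to keep
    i = sigmoid(z[1*H:2*H])   # input gate: how much new candidate to write
    o = sigmoid(z[2*H:3*H])   # output gate
    g = np.tanh(z[3*H:4*H])   # candidate cell state
    # The additive update is the key point: gradients flow back through
    # c via elementwise multiplication by f rather than through a
    # squashing nonlinearity, so they vanish far more slowly.
    c = f * c_prev + i * g
    h = o * np.tanh(c)
    return h, c

# Example with input size 3 and hidden size 2 (shapes are assumptions):
rng = np.random.default_rng(0)
x, h, c = rng.standard_normal(3), np.zeros(2), np.zeros(2)
W, b = rng.standard_normal((5, 8)) * 0.1, np.zeros(8)
h, c = lstm_step(x, h, c, W, b)
```

For item 4, the reason the swap helps: the derivative of the sigmoid is at most 0.25, so chaining it across many layers shrinks the gradient geometrically, whereas ReLU's derivative is exactly 1 on its active region. A small numeric illustration (the 20-layer depth is an arbitrary example):

```python
import numpy as np

def sigmoid_grad(z):
    """Derivative of the sigmoid; its maximum value is 0.25 at z = 0."""
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)

print(sigmoid_grad(0.0))        # 0.25, sigmoid's best case
# Chaining the best-case sigmoid derivative through 20 layers:
print(sigmoid_grad(0.0) ** 20)  # ~9.1e-13 -- the gradient vanishes
# ReLU's derivative is exactly 1 for z > 0, so the same chain keeps
# the gradient at 1.0 instead of shrinking it.
```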