Problem: during deep learning training, the cost is normal at first, then suddenly becomes NaN on some batch.
Resources found via web search:
1. How to avoid that Theano computing gradient going toward NaN https://stackoverflow.com/questions/40405334/how-to-avoid-that-theano-computing-gradient-going-toward-nan
2. What causes NaN when training a deep learning network, and how can it be avoided? https://www.zhihu.com/question/49346370
3. Theano debugging tips https://zhuanlan.zhihu.com/p/24857032
The advice in link 1 is actually quite good; it gives a few pieces of advice to avoid this problem.
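One common culprit behind a cost that suddenly goes NaN, mentioned in answers like link 1, is `log(0)` (or division by zero) inside the loss when a predicted probability saturates to exactly 0 or 1. A minimal NumPy sketch of the usual guard, clipping the arguments of `log` away from zero (the function name and the `eps` value here are my own illustration, not from the linked answers):

```python
import numpy as np

def stable_cross_entropy(p, y, eps=1e-7):
    """Binary cross-entropy that clips predictions into [eps, 1-eps]
    so np.log never receives 0, which would produce inf/NaN."""
    p = np.clip(p, eps, 1.0 - eps)
    return -np.mean(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))

def naive_cross_entropy(p, y):
    # No clipping: log(0) -> -inf, and 0 * inf -> NaN.
    return -np.mean(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))

# Saturated predictions that break the naive version.
p = np.array([0.0, 1.0, 0.5])
y = np.array([0.0, 1.0, 1.0])

with np.errstate(divide="ignore", invalid="ignore"):
    print(np.isfinite(naive_cross_entropy(p, y)))   # False: NaN sneaks in
print(np.isfinite(stable_cross_entropy(p, y)))      # True: loss stays finite
```

The same idea applies inside Theano/TensorFlow graphs: clip (or add a small epsilon to) anything fed into `log`, `sqrt`, or a denominator, and consider lowering the learning rate or clipping gradients if the NaN only appears after many normal batches.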