If the activation function is sigmoid or tanh, it is best to use Xavier initialization:
`tf.contrib.layers.xavier_initializer_conv2d`
![](https://img-blog.csdn.net/20170814193302352?watermark/2/text/aHR0cDovL2Jsb2cuY3Nkbi5uZXQvbGl5YW9oaGg=/font/5a6L5L2T/fontsize/400/fill/I0JBQkFCMA==/dissolve/70/gravity/Center)
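The rule these TF1 initializers implement can be sketched in plain NumPy. The sketch below assumes the Glorot uniform variant (sample limit `sqrt(6 / (fan_in + fan_out))`, giving variance `2 / (fan_in + fan_out)`); the function name is illustrative, not a TensorFlow API.

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, seed=0):
    # Glorot/Xavier uniform: limit = sqrt(6 / (fan_in + fan_out)).
    # Uniform(-limit, limit) has variance limit**2 / 3 = 2 / (fan_in + fan_out),
    # which keeps activation variance roughly constant through sigmoid/tanh layers.
    rng = np.random.default_rng(seed)
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

W = xavier_uniform(256, 128)
```

Every sampled weight stays inside the limit, and the empirical variance matches the target `2 / (fan_in + fan_out)`.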
If using ReLU, it is best to use He initialization:
`tf.contrib.layers.variance_scaling_initializer`
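A minimal NumPy sketch of the He (MSRA) normal rule that `variance_scaling_initializer` applies by default; again, the function name here is illustrative only.

```python
import numpy as np

def he_normal(fan_in, fan_out, seed=0):
    # He/MSRA normal: std = sqrt(2 / fan_in). The factor 2 compensates
    # for ReLU zeroing roughly half the activations on average.
    rng = np.random.default_rng(seed)
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

W = he_normal(256, 128)
```

The sampled weights are zero-mean with standard deviation close to `sqrt(2 / fan_in)`.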