Reached week 3 of course 2.
Programming assignments done through week 1 of course 2 -- Initialization
-
Different initialization methods can lead to different final performance.
-
Random initialization helps break symmetry, so that different units in the same hidden layer can learn different parameters.
-
When initializing, the initial values should not be too large.
-
He initialization paired with the ReLU activation function often gives good results.
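The note above can be sketched in NumPy. This is a minimal illustration, not the assignment's exact code: the function name `initialize_he` and the `layer_dims` list are hypothetical; the key point is scaling random weights by sqrt(2 / n_prev), which is He initialization.

```python
import numpy as np

def initialize_he(layer_dims, seed=0):
    """He initialization (hypothetical helper): weights ~ N(0, 1) scaled
    by sqrt(2 / n_prev), biases zero. layer_dims e.g. [n_x, n_h, n_y]."""
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        n_prev, n_cur = layer_dims[l - 1], layer_dims[l]
        # He scaling keeps the variance of ReLU activations roughly constant
        params[f"W{l}"] = rng.standard_normal((n_cur, n_prev)) * np.sqrt(2.0 / n_prev)
        params[f"b{l}"] = np.zeros((n_cur, 1))
    return params
```

Note the weights are small (for n_prev = 1000 the standard deviation is about 0.045), consistent with the earlier point that initial values should not be too large.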
Dropout helps prevent overfitting; apply it during training, but not at test time.
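The train/test distinction can be sketched as one ReLU layer with inverted dropout. This is an assumed, minimal version (the function name `forward_with_dropout` is hypothetical): during training, units are zeroed with probability 1 - keep_prob and the survivors are scaled by 1/keep_prob, so at test time no extra scaling is needed.

```python
import numpy as np

def forward_with_dropout(A_prev, W, b, keep_prob=0.8, train=True, rng=None):
    """One ReLU layer with inverted dropout (hypothetical helper).
    train=True: apply the dropout mask and rescale; train=False: plain forward pass."""
    rng = rng or np.random.default_rng(0)
    Z = W @ A_prev + b
    A = np.maximum(0, Z)  # ReLU
    if train:
        D = rng.random(A.shape) < keep_prob  # random dropout mask
        A = A * D / keep_prob                # inverted scaling keeps E[A] unchanged
    return A
```

Because of the inverted scaling, setting keep_prob=1.0 at training time reproduces the test-time forward pass exactly.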
Note that regularization hurts training set performance! This is because it limits the ability of the network to overfit to the training set. But since it ultimately gives better test accuracy, it is helping your system.
**What we want you to remember from this notebook**:
- Regularization will help you reduce overfitting.
- Regularization will drive your weights to lower values.
- L2 regularization and Dropout are two very effective regularization techniques.
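The L2 point above can be made concrete with a small sketch of the regularized cost, assuming a parameter dictionary with keys like "W1", "W2" as in the initialization sketch (the helper name `l2_cost` is hypothetical): the penalty (lambda / 2m) * sum of squared weights is added to the cross-entropy cost, which is what pushes the weights toward lower values.

```python
import numpy as np

def l2_cost(cross_entropy_cost, params, lambd, m):
    """Add the L2 penalty to an already-computed cross-entropy cost.
    params: dict of weight/bias arrays; only the W matrices are penalized."""
    l2_term = sum(np.sum(np.square(params[k])) for k in params if k.startswith("W"))
    return cross_entropy_cost + (lambd / (2 * m)) * l2_term
```

In the gradient this penalty contributes an extra (lambda / m) * W term to dW, which is the "weight decay" that drives weights lower.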