Stanford Machine Learning Notes
Gradient Descent
1. Gradient descent is best implemented with simultaneous (synchronous) updates of all parameters. Non-simultaneous updates produce odd results, though they may still happen to work.
2. If α is too small, gradient descent can be slow. If α is too large, gradient descent can overshoot the minimum. It may fail to converge, or even diverge.
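The two points above can be sketched in code. This is a minimal illustration, assuming batch gradient descent for linear regression; the function name `gradient_descent_step` and the tiny dataset are my own, not from the notes. The key detail for point 1 is that the gradient is computed from the current θ before any parameter is overwritten, so all parameters update simultaneously.

```python
import numpy as np

def gradient_descent_step(theta, X, y, alpha):
    """One simultaneous-update step of batch gradient descent.

    All partial derivatives are computed from the *current* theta
    before anything is overwritten; then every parameter is
    replaced at once with `theta - alpha * gradient`.
    """
    m = len(y)
    predictions = X @ theta                  # hypothesis h(x) for every example
    gradient = (X.T @ (predictions - y)) / m # d/d(theta_j) of the squared-error cost
    return theta - alpha * gradient          # old theta untouched until this point

# Tiny demo: fit y = 2x on three points (first column of 1s is the intercept term).
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])

# A reasonable alpha converges toward theta ≈ [0, 2].
theta = np.zeros(2)
for _ in range(2000):
    theta = gradient_descent_step(theta, X, y, alpha=0.1)

# Point 2: an overly large alpha overshoots the minimum and diverges.
theta_big = np.zeros(2)
for _ in range(50):
    theta_big = gradient_descent_step(theta_big, X, y, alpha=1.0)
# theta_big grows without bound instead of settling near [0, 2]
```

A non-simultaneous version would overwrite θ₀ first and then compute θ₁'s gradient from the already-updated θ₀, which is a different (and generally unintended) algorithm.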
2013-11-06 21:30:42