Machine Learning - Algorithms
zhichengMLE
Machine Learning
Get More Data
1. Why do we need more data? In many situations (a low-bias learning model), more data usually means better model performance. 2. When do we need more data? Usually, we should plot the learning … (Original, posted 2017-11-30 21:29:33 · 598 reads · 0 comments)
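The summary above is cut off mid-sentence; it appears to be heading toward plotting learning curves to judge whether more data would help. A minimal sketch of that diagnostic, not taken from the original post, using scikit-learn's learning_curve on synthetic data (the dataset and the LinearRegression model are assumptions for illustration):

```python
# Sketch: use learning curves to decide whether gathering more data is
# likely to help a low-bias model (illustrative, not the post's own code).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import learning_curve

# Synthetic data stands in for the post's dataset (assumption).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=200)

train_sizes, train_scores, val_scores = learning_curve(
    LinearRegression(), X, y, cv=5, train_sizes=np.linspace(0.1, 1.0, 5)
)

# If the validation score keeps climbing toward the training score as the
# training set grows, more data is likely to help; if both curves have
# plateaued close together, more data probably will not.
print(train_sizes)
print(train_scores.mean(axis=1))
print(val_scores.mean(axis=1))
```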
Batch Gradient Descent
We use linear regression as an example to explain this optimization algorithm. 1. Formula. 1.1. Cost function: we prefer the residual sum of squares to evaluate linear regression, J(θ) = … (Original, posted 2017-11-30 21:32:34 · 666 reads · 0 comments)
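The cost function in the summary is truncated at J(θ). A standard form consistent with "residual sum of squares" is J(θ) = (1/2m) Σᵢ (θᵀx⁽ⁱ⁾ − y⁽ⁱ⁾)², though the post's exact expression is not visible here. The sketch below implements batch gradient descent for linear regression under that assumption; the function name, step size, and toy dataset are illustrative, not from the post.

```python
# Sketch: batch gradient descent for linear regression, assuming the cost
# J(theta) = 1/(2m) * sum((X @ theta - y)**2).
import numpy as np

def batch_gradient_descent(X, y, lr=0.01, n_iters=1000):
    """Every iteration uses ALL m training examples to compute the gradient."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(n_iters):
        residuals = X @ theta - y          # errors on the whole training set, shape (m,)
        grad = (X.T @ residuals) / m       # gradient of J(theta)
        theta -= lr * grad
    return theta

# Tiny usage example; a bias column is prepended by hand (assumption).
X = np.c_[np.ones(4), np.array([1.0, 2.0, 3.0, 4.0])]
y = np.array([3.0, 5.0, 7.0, 9.0])         # generated from y = 1 + 2x
print(batch_gradient_descent(X, y, lr=0.05, n_iters=5000))  # ~[1., 2.]
```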
Stochastic Gradient Descent
1. What is Stochastic Gradient Descent? Stochastic Gradient Descent (SGD) is similar to Batch Gradient Descent, but it uses only one example for each iteration. This makes … (Original, posted 2017-12-12 01:36:48 · 1410 reads · 0 comments)
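As a hedged sketch (not from the original article): the same linear-regression setup as above, but each parameter update is computed from one randomly chosen training example, which is what makes SGD cheaper per update and noisier than the batch version. The hyperparameters and toy data are assumptions.

```python
# Sketch: stochastic gradient descent for linear regression, one example per update.
import numpy as np

def stochastic_gradient_descent(X, y, lr=0.01, n_epochs=50, seed=0):
    m, n = X.shape
    theta = np.zeros(n)
    rng = np.random.default_rng(seed)
    for _ in range(n_epochs):
        for i in rng.permutation(m):        # shuffle, then visit one example at a time
            residual = X[i] @ theta - y[i]  # scalar error on a single example
            theta -= lr * residual * X[i]   # gradient estimate from that one example only
    return theta

X = np.c_[np.ones(4), np.array([1.0, 2.0, 3.0, 4.0])]
y = np.array([3.0, 5.0, 7.0, 9.0])
print(stochastic_gradient_descent(X, y, lr=0.05, n_epochs=2000))  # ~[1., 2.]
```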
Mini-Batch Gradient Descent
1. What is Mini-Batch Gradient Descent? Mini-Batch Gradient Descent is an algorithm between Batch Gradient Descent and Stochastic Gradient Descent. Concretely, it uses some … (Original, posted 2017-12-12 01:37:16 · 857 reads · 0 comments)
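A hedged sketch of the idea the summary starts to describe: each update averages the gradient over a small, randomly drawn batch, sitting between SGD (batch size 1) and batch gradient descent (batch size m). The batch size, learning rate, and toy data below are assumptions for illustration, not the post's own values.

```python
# Sketch: mini-batch gradient descent for linear regression.
import numpy as np

def minibatch_gradient_descent(X, y, lr=0.01, n_epochs=200, batch_size=2, seed=0):
    m, n = X.shape
    theta = np.zeros(n)
    rng = np.random.default_rng(seed)
    for _ in range(n_epochs):
        order = rng.permutation(m)                      # reshuffle each epoch
        for start in range(0, m, batch_size):
            idx = order[start:start + batch_size]       # one small batch of examples
            Xb, yb = X[idx], y[idx]
            grad = Xb.T @ (Xb @ theta - yb) / len(idx)  # gradient averaged over the batch
            theta -= lr * grad
    return theta

X = np.c_[np.ones(4), np.array([1.0, 2.0, 3.0, 4.0])]
y = np.array([3.0, 5.0, 7.0, 9.0])
print(minibatch_gradient_descent(X, y, lr=0.05, n_epochs=3000, batch_size=2))  # ~[1., 2.]
```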