machine learning
不会游泳的海豚
Welcome to the CSDN-markdown editor
Using gradient descent to fit linear regression with one variable. Reposted 2017-04-09 16:50:34 · 136 views · 0 comments
Machine Learning Notes 2 --- Model Representation
h denotes the hypothesis, a function that maps from the x's to the y's. Reposted 2017-04-02 21:03:38 · 293 views · 0 comments
Machine Learning Notes 3 --- Cost Function
Hypothesis: $h_\theta(x) = \theta_0 + \theta_1 x$. Parameters: $\theta_0, \theta_1$. The cost function is $J(\theta_0, \theta_1) = \frac{1}{2m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)^2$, otherwise called the "squared error function" or "mean squared error"; it measures the accuracy of our hypothesis. Reposted 2017-04-02 21:10:18 · 199 views · 0 comments
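A minimal NumPy sketch of that squared-error cost (the names `X`, `y`, `theta` follow the notes' Octave convention; the toy data here is purely illustrative):

```python
import numpy as np

def compute_cost(X, y, theta):
    """Squared error cost J(theta) = (1/2m) * sum((X @ theta - y)^2)."""
    m = len(y)
    errors = X @ theta - y
    return (errors @ errors) / (2 * m)

# Toy data generated from y = 2x, with a bias column of ones prepended.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])

print(compute_cost(X, y, np.array([0.0, 2.0])))  # exact fit, so the cost is 0
```

A non-zero `theta` that misses the data gives a positive cost, which is what gradient descent will minimize in the next notes.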
Machine Learning Notes 4 --- Gradient Descent
What is gradient descent? In Octave, you can use `theta = theta - alpha*(((X*theta-y)'*X)./m)';`, where `theta` is a column vector, `X` is `[x1, x2, x3, ...]`, and `y` is a column vector. Reposted 2017-04-02 21:45:40 · 239 views · 0 comments
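The Octave one-liner translates directly to NumPy; a sketch of one batch gradient descent step (the learning rate `alpha` and the toy data are arbitrary choices for illustration):

```python
import numpy as np

def gradient_descent_step(X, y, theta, alpha):
    """One batch update: theta := theta - alpha * (1/m) * X^T (X theta - y)."""
    m = len(y)
    gradient = X.T @ (X @ theta - y) / m
    return theta - alpha * gradient

# Toy data from y = 2x, with a bias column of ones.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 4.0, 6.0])

theta = np.zeros(2)
theta = gradient_descent_step(X, y, theta, alpha=0.1)
print(theta)  # theta has moved from [0, 0] toward the minimizer
```

Note that `(X.T @ e)` here equals the transpose of the Octave expression `(e' * X)`, so the two updates compute the same vector.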
Machine Learning Notes 5 --- Gradient Descent for Linear Regression
The cost function for linear regression is convex, so it has only one global optimum and no other local optima. In Octave, you can use `theta = theta - alpha*(((X*theta-y)'*X)./m)';`, where `theta` is a column vector and `X` is `[x1, x2, x3, ...]`. Reposted 2017-04-02 21:54:48 · 233 views · 0 comments
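Because the linear-regression cost is convex, iterating that same update converges to the single global minimum regardless of the starting point; a small demonstration (the learning rate, step count, and data are arbitrary choices for this sketch):

```python
import numpy as np

# Toy data generated from y = 1 + 2x, with a bias column of ones.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])
y = np.array([3.0, 5.0, 7.0, 9.0])

theta = np.zeros(2)
alpha = 0.05
m = len(y)
for _ in range(20000):
    # Same batch update as in the previous note.
    theta = theta - alpha * (X.T @ (X @ theta - y)) / m

print(theta)  # approaches [1, 2], the unique global optimum
```

Running the loop from any other initial `theta` ends at the same point, which is exactly the "only one global, no other local" property the note describes.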