Machine Learning – Linear Regression with Multiple Variables (Andrew Ng)

1. Feature Scaling

Idea: Make sure features are on a similar scale.

Example: x1 ~ [0,2000]; x2 ~ [1,5];

If you use these two features without scaling, the contours of the cost function J(theta) become very skewed, tall and skinny ellipses. Running gradient descent on such a cost function, the updates can oscillate back and forth across the narrow valley and take a long time to converge.

Methods:

  1. Rescaling: x = x / max(x);
  2. Mean normalization: x = (x - mean(x)) / std(x); (see the sketch after this list)
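
As a sketch of method 2, here is a minimal Octave version of mean normalization. The function name featureNormalize, and returning mu and sigma so the same transform can be reused on new examples, are illustrative assumptions rather than something stated in the notes above.

% Mean normalization: subtract the per-feature mean, divide by the
% per-feature standard deviation. Assumes X is an m x n feature matrix
% and an Octave/MATLAB version with automatic broadcasting.
function [X_norm, mu, sigma] = featureNormalize(X)
  mu = mean(X);                  % 1 x n row vector of feature means
  sigma = std(X);                % 1 x n row vector of feature standard deviations
  X_norm = (X - mu) ./ sigma;    % each feature ends up with zero mean, unit variance
end

After scaling, the column of ones for the intercept term is typically appended before running gradient descent, e.g. X = [ones(m, 1), X_norm];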

2. Difference between Gradient Descent and Normal Equation

Gradient Descent                              | Normal Equation
----------------------------------------------|------------------------------------------------
need to choose alpha                          | no need to choose alpha
needs many iterations                         | no need to iterate
works well even when n (# features) is large  | needs to compute inv(X'*X); slow if n is large
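
For reference, a minimal Octave sketch of the normal equation, assuming X is the m x (n+1) design matrix (with a leading column of ones) and y is the m x 1 target vector:

% Closed-form solution: no learning rate, no iterations.
% pinv is used rather than inv so the computation still works
% when X'*X is singular (e.g. redundant features).
theta = pinv(X' * X) * X' * y;

Computing this is roughly O(n^3) in the number of features, which is why the table above favors gradient descent when n is large.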

3. Code

  1. Computing the cost function
J = (0.5/m) * (y - X*theta)' * (y - X*theta);   % J(theta) = (1/(2m)) * sum of squared errors
  2. Gradient descent update (a full loop sketch follows)
theta = theta + (alpha/m) * (X' * (y - X*theta));   % vectorized update of all theta_j at once
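
A minimal Octave sketch that ties the two lines together into a full run; variable names such as num_iters and J_history, and the specific values of alpha and the iteration count, are illustrative assumptions rather than part of these notes:

% Assumes X is the m x (n+1) design matrix (scaled features plus a leading
% column of ones) and y is the m x 1 target vector.
m = length(y);                     % number of training examples
alpha = 0.01;                      % learning rate (assumed value)
num_iters = 400;                   % number of gradient descent steps (assumed value)
theta = zeros(size(X, 2), 1);      % initialize parameters to zero
J_history = zeros(num_iters, 1);   % cost after each step, to check convergence

for iter = 1:num_iters
  theta = theta + (alpha/m) * (X' * (y - X*theta));              % gradient step
  J_history(iter) = (0.5/m) * (y - X*theta)' * (y - X*theta);    % record cost
end

Plotting J_history against the iteration number is a quick check on alpha: the cost should decrease on every iteration, and if it does not, alpha is too large.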