网易云课堂 Machine Learning Programming Exercise 1: Linear Regression

Part 1: Linear Regression with One Variable

1. warmUpExercise.m: output the 5×5 identity matrix

A = eye(5);

2. plotData.m: plot the data

figure; 
plot(x, y, 'rx', 'MarkerSize', 10); 
ylabel('Profit in $10,000s'); 
xlabel('Population of City in 10,000s'); 
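
To exercise plotData, the data can be loaded the same way ex1.m does; the ex1data1.txt layout below (population in column 1, profit in column 2) is assumed from the assignment:

data = load('ex1data1.txt');      % assumed file name and column layout from the assignment
x = data(:, 1);                   % population of a city (in 10,000s)
y = data(:, 2);                   % profit (in $10,000s)
plotData(x, y);                   % scatter plot of profit vs. population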

3. computeCost.m: compute the cost

$$J(\theta)=\frac{1}{2 m} \sum_{i=1}^{m}\left(h_{\theta}\left(x^{(i)}\right)-y^{(i)}\right)^{2}$$

J = sum((X * theta - y).^2) / (2*m); 
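
For context, a minimal sketch of the full computeCost.m, assuming the skeleton's function J = computeCost(X, y, theta) signature:

function J = computeCost(X, y, theta)
% Compute the squared-error cost for linear regression.
% X is the m x 2 design matrix (first column all ones), y the m x 1 targets.
m = length(y);                     % number of training examples
h = X * theta;                     % hypothesis h_theta(x) for all examples at once
J = sum((h - y).^2) / (2 * m);     % squared-error cost
end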

4. gradientDescent.m: gradient descent, iteratively updating the parameters

$$\theta_{j} := \theta_{j}-\alpha \frac{1}{m} \sum_{i=1}^{m}\left(h_{\theta}\left(x^{(i)}\right)-y^{(i)}\right) x_{j}^{(i)}, \quad (j=0,1)$$

% inside the for-loop over iterations
theta_s = theta;                                                     % snapshot of the old theta so both updates are simultaneous
theta(1) = theta(1) - alpha / m * sum(X * theta_s - y);
theta(2) = theta(2) - alpha / m * sum((X * theta_s - y) .* X(:,2));
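
A minimal usage sketch for driving the function (the gradientDescent signature, the learning rate 0.01, and the 1500 iterations are assumed defaults from the assignment skeleton):

m = length(y);                              % number of training examples
X = [ones(m, 1), x];                        % add the intercept column x_0 = 1
theta = zeros(2, 1);                        % initialize the parameters
alpha = 0.01;                               % assumed default learning rate
iterations = 1500;                          % assumed default iteration count
theta = gradientDescent(X, y, theta, alpha, iterations);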

Part 2: Linear Regression with Multiple Variables

1. featureNormalize.m: feature normalization

$$z=\frac{x-\mu}{\sigma}$$

 mu = mean(X);       %  mean value 
 sigma = std(X);     %  standard deviation
 X_norm  = (X - repmat(mu,size(X,1),1)) ./  repmat(sigma,size(X,1),1);
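
After normalization, the column of ones is prepended before running gradient descent; a short usage sketch (the [X_norm, mu, sigma] return signature is assumed from the skeleton):

[X_norm, mu, sigma] = featureNormalize(X);   % keep mu and sigma to normalize new examples later
X = [ones(size(X_norm, 1), 1), X_norm];      % prepend the intercept column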

2. computeCostMulti.m: compute the cost

J = sum((X * theta - y).^2) / (2*m); 

3. gradientDescentMulti.m: gradient descent

$$\frac{\partial}{\partial \theta_{j}} J(\theta) = \frac{1}{2 m}\left[2\left(h_{\theta}\left(x^{(1)}\right)-y^{(1)}\right) x_{j}^{(1)}+2\left(h_{\theta}\left(x^{(2)}\right)-y^{(2)}\right) x_{j}^{(2)}+\ldots\right] = \frac{1}{m} \sum_{i=1}^{m}\left(h_{\theta}\left(x^{(i)}\right)-y^{(i)}\right) x_{j}^{(i)}$$

$$\theta_{j} := \theta_{j}-\alpha \frac{1}{m} \sum_{i=1}^{m}\left(h_{\theta}\left(x^{(i)}\right)-y^{(i)}\right) x_{j}^{(i)}, \quad (j=0,1,\ldots,n)$$

Instead of looping over each θ_j, the update can be written directly with matrix multiplication:

$$\theta=\theta-\alpha \frac{1}{m} X^{T}(X \theta-Y)$$

theta = theta - alpha / m * X' * (X * theta - y); 
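
Wrapped in the full loop (the [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters) signature is assumed from the skeleton), recording J_history makes it easy to confirm the cost decreases every iteration:

function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
% Vectorized gradient descent for any number of features.
m = length(y);
J_history = zeros(num_iters, 1);
for iter = 1:num_iters
    theta = theta - alpha / m * X' * (X * theta - y);   % update all theta_j simultaneously
    J_history(iter) = computeCostMulti(X, y, theta);    % should decrease for a suitable alpha
end
end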

4. Using the θ obtained after the iterations, predict the price of a 1650-square-foot house with 3 bedrooms

price = [1 (([1650 3] - mu) ./ sigma)] * theta;   % normalize the new example with the training-set mu and sigma before predicting

Part 3: Solving with the Normal Equation

$$\theta=\left(X^{T} X\right)^{-1} X^{T} Y$$

theta = pinv(X' * X) * X' * y;   % pinv keeps this stable even if X'X is non-invertible
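
Because the normal equation is solved on the raw features, the same 1650 sq-ft / 3-bedroom prediction needs no mean/std scaling, and the result should be close to the gradient-descent one (a sketch, assuming X here was built without featureNormalize):

price = [1 1650 3] * theta;      % no normalization needed with the normal-equation theta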

This is my first time writing with Markdown; there is still a lot to improve and to learn.
