# Exercise: Linear Regression

Step 1: Data preparation.

Plot the training data (age on the x-axis, height on the y-axis):

```matlab
figure % open a new figure window
plot(x, y, 'o');
ylabel('Height in meters')
xlabel('Age in years')
```

The hypothesis is the linear model h_θ(x) = θ₀ + θ₁x.

```matlab
m = length(y);       % store the number of training examples
x = [ones(m, 1), x]; % add a column of ones to x (the intercept term)
```

Step 2: Linear regression.

1. Run gradient descent with a learning rate of 0.07. Initialize the weight vector θ = (θ₀, θ₁) to zeros, then compute θ after a single iteration.

```matlab
>> theta = zeros(1, 2)
theta =     0     0

>> delta = (x * theta') - y;      % prediction errors, m x 1
>> grad = delta' * x;             % gradient numerator; don't call this "sum" -- that shadows the built-in
>> delta_theta = grad * 0.07 / m;
>> theta1 = theta - delta_theta

theta1 =    0.0745    0.3800
```
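The same update can be checked in plain Python (no Octave needed). The tiny (age, height) data set below is made up for illustration, so the shape of the computation, not the numbers, is what matches the exercise:

```python
# One batch gradient-descent step, mirroring the Octave snippet above.
xs = [2.0, 4.0, 6.0]   # ages (the feature column, before adding ones); made-up data
ys = [0.9, 1.0, 1.1]   # heights in meters; made-up data
m = len(ys)
alpha = 0.07           # learning rate from the exercise
theta = [0.0, 0.0]     # [theta_0, theta_1], initialized to zeros

# h(x) = theta_0 + theta_1*x; gradient of J is (1/m) * X' * (X*theta - y)
errors = [theta[0] + theta[1] * x - y for x, y in zip(xs, ys)]
grad0 = sum(errors) / m                             # partial derivative w.r.t. theta_0
grad1 = sum(e * x for e, x in zip(errors, xs)) / m  # partial derivative w.r.t. theta_1
theta = [theta[0] - alpha * grad0, theta[1] - alpha * grad1]
print(theta)
```

With θ initialized to zeros, the first errors are just −y, so the first step moves θ₀ by α·mean(y) and θ₁ by α·mean(x·y).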

2. Iterate gradient descent until convergence. The function below stops once every component of theta changes by less than `accuracy`, and plots an intermediate fit every 100 iterations:

```matlab
function [ result ] = gradientDescent( x, y, alpha, accuracy )
%GRADIENTDESCENT Batch gradient descent for univariate linear regression.
%   Stops when every component of theta changes by less than `accuracy`;
%   plots an intermediate fit every 100 iterations.
orgX = x;
plot(orgX, y, 'o');
ylabel('Height in meters')
xlabel('Age in years')
m = length(y);
x = [ones(m, 1), x];   % x is now an m-by-2 matrix; column 2 holds the ages
theta = zeros(1, 2);
hold on;
times = 0;
while (1)
    times = times + 1;
    delta = (x * theta') - y;
    grad = delta' * x;              % renamed from "sum" to avoid shadowing the built-in
    delta_theta = grad * alpha / m;
    theta1 = theta - delta_theta;
    if (all(abs(theta1(:) - theta(:)) < accuracy))
        result = theta;
        break;
    end
    theta = theta1;
    if (mod(times, 100) == 0)
        plot(x(:, 2), x * theta', '-', 'Color', [mod(times/100, 256)/256 128/256 2/256]);
        legend('Training data', 'Linear regression');
    end
end

end
```
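For readers following along without Octave, here is a minimal Python sketch of the same convergence loop, with the same stopping rule (every component of θ changes by less than `accuracy`). The data is synthetic, generated from y = 0.5 + 0.1·x, so gradient descent should recover those two weights:

```python
# Batch gradient descent, stopping when the per-component update is tiny.
def gradient_descent(xs, ys, alpha, accuracy, max_iters=100000):
    m = len(ys)
    theta = [0.0, 0.0]  # [theta_0, theta_1]
    for _ in range(max_iters):
        errors = [theta[0] + theta[1] * x - y for x, y in zip(xs, ys)]
        step0 = alpha * sum(errors) / m
        step1 = alpha * sum(e * x for e, x in zip(errors, xs)) / m
        theta = [theta[0] - step0, theta[1] - step1]
        if abs(step0) < accuracy and abs(step1) < accuracy:
            break
    return theta

xs = [1.0, 2.0, 3.0, 4.0]
ys = [0.5 + 0.1 * x for x in xs]   # exactly linear, so theta should approach [0.5, 0.1]
theta = gradient_descent(xs, ys, alpha=0.07, accuracy=1e-9)
print(theta)
```

Note that for a learning rate this small the loop takes a couple of thousand iterations on this data; too large an α would make it diverge instead.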

A variant of the same function that draws only the final regression line once convergence is reached:

```matlab
function [ result ] = gradientDescent( x, y, alpha, accuracy )
%GRADIENTDESCENT Batch gradient descent, plotting only the final fit.
orgX = x;
plot(orgX, y, 'o');
ylabel('Height in meters')
xlabel('Age in years')
m = length(y);
x = [ones(m, 1), x];   % x is now an m-by-2 matrix; column 2 holds the ages
theta = zeros(1, 2);
hold on;
while (1)
    delta = (x * theta') - y;
    grad = delta' * x;              % renamed from "sum" to avoid shadowing the built-in
    delta_theta = grad * alpha / m;
    theta1 = theta - delta_theta;
    if (all(abs(theta1(:) - theta(:)) < accuracy))
        result = theta;
        plot(x(:, 2), x * theta', '-', 'Color', [200/256 128/256 2/256]);
        legend('Training data', 'Linear regression');
        break;
    end
    theta = theta1;
end

end
```

On the exercise data, gradient descent converges to:

```matlab
theta =    0.7502    0.0639
```

3. Now that we have trained the weight vector theta, we can use it to make predictions on new data.

Predict the heights for two boys, ages 3.5 and 7:

```matlab
>> boys = [3.5; 7];
>> boys = [ones(2, 1), boys];   % add the intercept column
>> heights = boys * theta'

heights =

    0.9737
    1.1973
```
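The same prediction in plain Python, using the rounded θ printed above (so the results agree with the Octave output only to about three decimal places):

```python
# Predict heights from ages with the trained linear model h(x) = theta_0 + theta_1*x.
theta = [0.7502, 0.0639]   # rounded weights from the session above
ages = [3.5, 7.0]
heights = [theta[0] + theta[1] * a for a in ages]
print(heights)             # close to the 0.9737 and 1.1973 printed by Octave
```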

4. Understanding J(θ).

J(θ) is the cost function associated with the weight vector: the choice of weights determines the cost. We want the cost to be as small as possible, which amounts to finding the minimum of the cost function. A simple way to find a function's minimum is to differentiate it (assuming it is differentiable) and take the point where the derivative is zero; that point is the minimum (or possibly only a local minimum). Gradient descent takes the partial derivatives of J(θ) and approaches the zero-derivative point step by step.
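As a concrete check of the definition, here is the cost J(θ) = (1/2m)·Σᵢ(θ₀ + θ₁xᵢ − yᵢ)² in plain Python, on a small made-up data set. A perfect fit drives the cost to (numerically) zero, and any other θ gives a larger value:

```python
# Mean squared error cost for univariate linear regression.
def cost(theta, xs, ys):
    m = len(ys)
    return sum((theta[0] + theta[1] * x - y) ** 2 for x, y in zip(xs, ys)) / (2 * m)

xs = [1.0, 2.0, 3.0]
ys = [0.6, 0.7, 0.8]             # made-up data: exactly y = 0.5 + 0.1*x
print(cost([0.5, 0.1], xs, ys))  # ~0: the perfect fit
print(cost([0.0, 0.0], xs, ys))  # larger: the all-zeros weights
```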

```matlab
function [] = showJTheta( x, y )
%SHOWJTHETA Plot the cost surface J(theta) over a grid of weight values.
J_vals = zeros(100, 100);   % initialize J_vals to a 100x100 matrix of zeros
theta0_vals = linspace(-3, 3, 100);
theta1_vals = linspace(-1, 1, 100);
for i = 1:length(theta0_vals)
    for j = 1:length(theta1_vals)
        t = [theta0_vals(i); theta1_vals(j)];
        J_vals(i, j) = sum((x * t - y) .^ 2) / (2 * length(y));
    end
end

% Because of the way meshgrids work in the surf command, we need to
% transpose J_vals before calling surf, or else the axes will be flipped
J_vals = J_vals';
figure;
surf(theta0_vals, theta1_vals, J_vals);
axis([-3, 3, -1, 1, 0, 40]);
xlabel('\theta_0'); ylabel('\theta_1')

end
```

```matlab
x = [ones(length(y), 1), x];
showJTheta(x, y);
```

Note that `theta0_vals`, `theta1_vals`, and `J_vals` are local to `showJTheta`, so they must be recomputed (or returned by the function) before the contour plot can be drawn in the calling workspace:

```matlab
figure;
% Plot the cost function with 15 contours spaced logarithmically
% between 0.01 and 100
contour(theta0_vals, theta1_vals, J_vals, logspace(-2, 2, 15));
xlabel('\theta_0'); ylabel('\theta_1');
```
