# Exercise: Linear Regression


Step 1: Data preparation.

figure % open a new figure window
plot(x, y, 'o');
ylabel('Height in meters')
xlabel('Age in years')

h_θ(x) = θ_0 + θ_1·x

m = length(y); % store the number of training examples
x = [ones(m, 1), x]; % Add a column of ones to x
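The same setup can be sketched in NumPy for readers without MATLAB. The arrays below are made-up stand-ins for the exercise's age/height dataset:

```python
import numpy as np

# Made-up stand-ins for the exercise's age (x) and height (y) vectors.
x = np.array([2.0, 3.0, 4.0, 5.0])
y = np.array([0.9, 1.0, 1.1, 1.2])

m = len(y)                             # number of training examples
X = np.column_stack([np.ones(m), x])   # add a column of ones for the intercept term
```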

Step 2: Linear regression.

1. Run gradient descent with a learning rate of 0.07. Initialize the weight vector θ = (θ_0, θ_1) to zeros, then compute the value of θ after one iteration.

>> theta=zeros(1,2)
theta =     0     0

delta = (x*theta') - y;        % prediction error for each example
grad = delta'*x;               % gradient numerator (the name "sum" would shadow a builtin)
delta_theta = grad*0.07/m;     % scaled update, alpha = 0.07
theta1 = theta - delta_theta;  % theta after one gradient step

theta1 =    0.0745    0.3800
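The single update above can be cross-checked with a NumPy sketch. The data here is made up, so the numbers differ from the 0.0745 / 0.3800 obtained with the exercise's dataset:

```python
import numpy as np

alpha = 0.07                           # learning rate from the exercise
x = np.array([2.0, 3.0, 4.0, 5.0])     # made-up ages
y = np.array([0.9, 1.0, 1.1, 1.2])     # made-up heights
m = len(y)
X = np.column_stack([np.ones(m), x])

theta = np.zeros(2)                    # (theta0, theta1) initialized to zero
delta = X @ theta - y                  # prediction error, one entry per example
grad = delta @ X / m                   # gradient of J(theta)
theta = theta - alpha * grad           # one gradient-descent step
# For this made-up data, theta is now [0.0735, 0.266].
```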

function [ result ] = gradientDescent( x, y, alpha, accuracy )
%GRADIENTDESCENT Batch gradient descent for single-variable linear regression.
%   Iterates until every component of theta changes by less than "accuracy",
%   plotting the current fit every 100 iterations.
orgX = x;
plot(orgX, y, 'o');
ylabel('Height in meters')
xlabel('Age in years')
m = length(y);
x = [ones(m, 1), x];                   % add the intercept column
theta = zeros(1, 2);
hold on;
times = 0;
while true
    times = times + 1;
    delta = (x*theta') - y;            % prediction error for each example
    grad = delta'*x;                   % gradient numerator ("sum" would shadow a builtin)
    delta_theta = grad*alpha/m;
    theta1 = theta - delta_theta;
    if all(abs(theta1(:) - theta(:)) < accuracy)
        result = theta;
        break;
    end
    theta = theta1;
    if mod(times, 100) == 0
        % x is now a matrix with 2 columns; the second column holds the ages
        plot(x(:,2), x*theta', '-', 'color', [mod(times/100, 256)/256 128/256 2/256]);
        legend('Training data', 'Linear regression');
    end
end

end
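A NumPy translation of the same loop (plotting omitted; the stopping test and update mirror the MATLAB version, and the function name `gradient_descent` plus the noiseless test data are assumptions for illustration):

```python
import numpy as np

def gradient_descent(x, y, alpha, accuracy):
    """Iterate theta updates until each component changes by less than `accuracy`."""
    m = len(y)
    X = np.column_stack([np.ones(m), x])   # add the intercept column
    theta = np.zeros(2)
    while True:
        grad = (X @ theta - y) @ X / m     # gradient of J(theta)
        new_theta = theta - alpha * grad
        if np.all(np.abs(new_theta - theta) < accuracy):
            return theta
        theta = new_theta

# Fit a noiseless line y = 0.5 + 0.2 x; the fit should recover these values.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 0.5 + 0.2 * x
theta = gradient_descent(x, y, alpha=0.07, accuracy=1e-9)
```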

A simpler variant plots the regression line only once, after convergence:

function [ result ] = gradientDescent( x, y, alpha, accuracy )
%GRADIENTDESCENT Batch gradient descent; plots only the final fitted line.
orgX = x;
plot(orgX, y, 'o');
ylabel('Height in meters')
xlabel('Age in years')
m = length(y);
x = [ones(m, 1), x];                   % add the intercept column
theta = zeros(1, 2);
hold on;
while true
    delta = (x*theta') - y;            % prediction error for each example
    grad = delta'*x;                   % gradient numerator ("sum" would shadow a builtin)
    delta_theta = grad*alpha/m;
    theta1 = theta - delta_theta;
    if all(abs(theta1(:) - theta(:)) < accuracy)
        result = theta;
        % x is now a matrix with 2 columns; the second column holds the ages
        plot(x(:,2), x*theta', '-', 'color', [200/256 128/256 2/256]);
        legend('Training data', 'Linear regression');
        break;
    end
    theta = theta1;
end

end

theta =    0.7502    0.0639

3. Now that we have trained the weight vector theta, we can use it to make predictions on new data.

Predict the heights for two boys, aged 3.5 and 7:

>> boys=[3.5;7];
>> boys=[ones(2,1),boys];
>> heights=boys*theta';

heights =

0.9737
1.1973
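The prediction step in NumPy, using the θ = (0.7502, 0.0639) reported above; the results agree with the transcript to within the rounding of θ:

```python
import numpy as np

theta = np.array([0.7502, 0.0639])       # trained weights from the exercise
boys = np.array([3.5, 7.0])              # ages to predict for
B = np.column_stack([np.ones(2), boys])  # add the intercept column
heights = B @ theta                      # roughly [0.974, 1.197]
```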

4. Understanding J(θ).

J(θ) is the cost function associated with the weight vector: the choice of θ determines the cost, and we want the cost to be as small as possible, which amounts to finding the minimum of the cost function. A simple way to locate a function's minimum is to differentiate it (assuming it is differentiable) and look at the points where the derivative is zero; such a point is the minimum (or possibly only a local minimum). Gradient descent takes the partial derivatives of J(θ) and steps, iteration by iteration, toward a point where they vanish.

function [] = showJTheta( x, y )
%SHOWJTHETA Plot the cost surface J(theta) over a grid of theta values.
J_vals = zeros(100, 100);           % initialize J_vals to a 100x100 matrix of zeros
theta0_vals = linspace(-3, 3, 100);
theta1_vals = linspace(-1, 1, 100);
for i = 1:length(theta0_vals)
    for j = 1:length(theta1_vals)
        t = [theta0_vals(i); theta1_vals(j)];
        J_vals(i,j) = sum((x*t - y).^2)/(2*length(y));
    end
end

% Plot the surface plot.
% Because of the way meshgrids work in the surf command, we need to
% transpose J_vals before calling surf, or else the axes will be flipped
J_vals = J_vals';
figure;
surf(theta0_vals, theta1_vals, J_vals);
axis([-3, 3, -1, 1, 0, 40]);
xlabel('\theta_0'); ylabel('\theta_1')

end

x = [ones(length(y), 1), x];
showJTheta(x, y);

% Note: the contour plot below needs theta0_vals, theta1_vals, and J_vals in
% the workspace, so either run the body of showJTheta as a script or have the
% function return them.
figure;
% Plot the cost function with 15 contours spaced logarithmically
% between 0.01 and 100
contour(theta0_vals, theta1_vals, J_vals, logspace(-2, 2, 15));
xlabel('\theta_0'); ylabel('\theta_1');
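The same grid evaluation of J(θ) can be sketched in NumPy; with made-up, perfectly linear data, the grid minimum lands near the true parameters and the cost there is zero:

```python
import numpy as np

def cost(X, y, theta):
    """J(theta) = (1/2m) * sum((X*theta - y)^2), as in the paragraph above."""
    residual = X @ theta - y
    return residual @ residual / (2 * len(y))

# Made-up, perfectly linear data: y = 0.5 + 0.2 x
x = np.array([1.0, 2.0, 3.0])
y = 0.5 + 0.2 * x
X = np.column_stack([np.ones(3), x])

# Evaluate J over a grid of (theta0, theta1) pairs, as showJTheta does.
theta0_vals = np.linspace(-3, 3, 100)
theta1_vals = np.linspace(-1, 1, 100)
J_vals = np.array([[cost(X, y, np.array([t0, t1]))
                    for t1 in theta1_vals] for t0 in theta0_vals])

# The grid minimum should sit near the true parameters (0.5, 0.2).
i, j = np.unravel_index(J_vals.argmin(), J_vals.shape)
```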
