Programming Exercise 1: Linear Regression.
0 Introduction
What this post covers:
The Week 2 programming assignment from Andrew Ng's Machine Learning course on Coursera. The language is MATLAB.
2 Linear regression with one variable
Initialization:
%% Initialization
clear ; close all; clc
The initialization section:
clear removes all variables from the workspace; followed by a variable name, it clears only that variable;
close all closes all open figure windows (the figure windows used to display plots);
clc clears the Command Window (the previously displayed commands and output).
Part 1: Basic Function
%% ==================== Part 1: Basic Function ====================
% Complete warmUpExercise.m
fprintf('Running warmUpExercise ... \n');
fprintf('5x5 Identity Matrix: \n');
warmUpExercise()
fprintf('Program paused. Press enter to continue.\n');
pause;
fprintf writes formatted data to a text file, or prints numeric values and literal text to the screen.
\n is the newline character; strings in MATLAB are enclosed in single quotes.
pause suspends execution until a key is pressed.
warmUpExercise() calls the warmUpExercise function defined in warmUpExercise.m, which must return a 5x5 identity matrix; the eye function does this directly. A sketch of the function follows.
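A minimal sketch of warmUpExercise.m, assuming the assignment's standard stub (function name and output variable A as in the provided template):

function A = warmUpExercise()
%WARMUPEXERCISE Example function returning the 5x5 identity matrix
A = eye(5);   % eye(n) builds an n-by-n identity matrix
end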
Part 2: Plotting
%% ======================= Part 2: Plotting =======================
fprintf('Plotting Data ...\n')
data = load('ex1data1.txt');
X = data(:, 1); y = data(:, 2);
m = length(y); % number of training examples
% Plot Data
% Note: You have to complete the code in plotData.m
plotData(X, y);
fprintf('Program paused. Press enter to continue.\n');
pause;
load reads data from a file;
plotData() is called to plot the data; a sketch of the function follows.
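A minimal sketch of plotData.m, following the assignment's instructions (red crosses for the data points, with the axis labels the exercise asks for):

function plotData(x, y)
%PLOTDATA Plots the data points x and y into a new figure
figure;                                    % open a new figure window
plot(x, y, 'rx', 'MarkerSize', 10);        % plot data points as red crosses
ylabel('Profit in $10,000s');              % label the y-axis
xlabel('Population of City in 10,000s');   % label the x-axis
end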
Part 3: Cost and Gradient descent
%% =================== Part 3: Cost and Gradient descent ===================
X = [ones(m, 1), data(:,1)]; % Add a column of ones to x
theta = zeros(2, 1); % initialize fitting parameters
% Some gradient descent settings
iterations = 1500;
alpha = 0.01;
fprintf('\nTesting the cost function ...\n')
% compute and display initial cost
J = computeCost(X, y, theta);
fprintf('With theta = [0 ; 0]\nCost computed = %f\n', J);
fprintf('Expected cost value (approx) 32.07\n');
% further testing of the cost function
J = computeCost(X, y, [-1 ; 2]);
fprintf('\nWith theta = [-1 ; 2]\nCost computed = %f\n', J);
fprintf('Expected cost value (approx) 54.24\n');
fprintf('Program paused. Press enter to continue.\n');
pause;
fprintf('\nRunning Gradient Descent ...\n')
% run gradient descent
theta = gradientDescent(X, y, theta, alpha, iterations);
% print theta to screen
fprintf('Theta found by gradient descent:\n');
fprintf('%f\n', theta);
fprintf('Expected theta values (approx)\n');
fprintf(' -3.6303\n 1.1664\n\n');
% Plot the linear fit
hold on; % keep previous plot visible
plot(X(:,2), X*theta, '-')
legend('Training data', 'Linear regression')
hold off % don't overlay any more plots on this figure
% Predict values for population sizes of 35,000 and 70,000
predict1 = [1, 3.5] * theta;
fprintf('For population = 35,000, we predict a profit of %f\n',...
predict1*10000);
predict2 = [1, 7] * theta;
fprintf('For population = 70,000, we predict a profit of %f\n',...
predict2*10000);
fprintf('Program paused. Press enter to continue.\n');
pause;
This part computes the cost and runs gradient descent, then plots the fitted hypothesis h_θ(x) in the figure.
computeCost(X, y, theta) is called to compute the cost; a sketch of the function follows the formulas below.
The cost function:
J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \big( h_\theta(x^{(i)}) - y^{(i)} \big)^2
where the hypothesis h_\theta(x) is given by the linear model:
h_\theta(x) = \theta^T x = \theta_0 + \theta_1 x_1
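A minimal vectorized sketch of computeCost.m, assuming X already includes the column of ones added above:

function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
m = length(y);                      % number of training examples
errors = X * theta - y;             % h_theta(x^(i)) - y^(i) for every example
J = (errors' * errors) / (2 * m);   % vectorized squared-error cost
end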
gradientDescent(X, y, theta, alpha, iterations) is called to run batch gradient descent; a sketch of the function follows.
In batch gradient descent, each iteration performs the update:
\theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} \big( h_\theta(x^{(i)}) - y^{(i)} \big) x_j^{(i)} \quad \text{(simultaneously for all } j\text{)}
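A minimal sketch of gradientDescent.m, assuming the course template's signature (the optional second output J_history records the cost per iteration so convergence can be inspected):

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs batch gradient descent to learn theta
m = length(y);                     % number of training examples
J_history = zeros(num_iters, 1);   % cost recorded at each iteration
for iter = 1:num_iters
    errors = X * theta - y;                        % m-by-1 residual vector
    theta = theta - (alpha / m) * (X' * errors);   % simultaneous update of all theta_j
    J_history(iter) = computeCost(X, y, theta);    % track convergence
end
end

The update is vectorized: X' * errors computes all the sums \sum_i (h_\theta(x^{(i)}) - y^{(i)}) x_j^{(i)} at once, so every theta_j is updated simultaneously as the formula requires.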