1. Plotting the Data
Before starting on the task, it is useful to visualize the data to understand it.
1) Load the data
Functions: load; length
data = load('ex1data1.txt'); % read comma separated data
X = data(:, 1); y = data(:, 2);
m = length(y); % number of training examples
2) Visualize the data: plotData.m
Functions: figure; plot; title; xlabel; ylabel
figure
plot(X, y, 'Color', 'g', 'LineStyle', 'none', 'Marker', 'x', 'MarkerSize', 10); % Plot the data (X is still the single feature column here)
title('plotData');
xlabel('Population of City in 10,000s'); % Set the x-axis label
ylabel('Profit in $10,000s');            % Set the y-axis label
2. Gradient Descent
1) Prepare the parameters
Functions: ones; zeros
X = [ones(m, 1), data(:,1)]; % Add a column of ones to x
theta = zeros(2, 1); % initialize fitting parameters
iterations = 1500;
alpha = 0.01;
2) Computing the cost: computeCost.m
Function: sum
Operator: .^
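In math form, computeCost evaluates the squared-error cost of the linear hypothesis over the m training examples:

```latex
h_\theta(x) = \theta_0 + \theta_1 x, \qquad
J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)^2
```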
function J = computeCost(X, y, theta)
%COMPUTECOST Compute the cost for linear regression
%   J is the squared-error cost of using theta to fit the data in X and y.
%   Note: A .^ 2 squares every element of matrix A.
m = length(y);          % number of training examples
h = X * theta;          % hypothesis, an m x 1 vector
J = 1 / (2*m) * sum((h - y) .^ 2);
end
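A quick sanity check (a sketch, assuming X, y, and m have been prepared as in step 1 above):

```matlab
% Cost with theta initialized to zeros; the result is a single scalar.
J = computeCost(X, y, zeros(2, 1));
fprintf('Initial cost J = %f\n', J);   % for ex1data1.txt this is roughly 32.07
```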
3. Gradient Descent Implementation: gradientDescent.m
function [theta, J_history] = gradientDescent(X, y, theta, alpha, iterations)
m = length(y);                       % number of training examples
J_history = zeros(iterations, 1);    % preallocate the cost history
theta_s = theta;                     % copy of theta for the simultaneous update
for i = 1:iterations
    theta(1) = theta(1) - alpha/m * sum(X*theta_s - y);
    theta(2) = theta(2) - alpha/m * sum((X*theta_s - y) .* X(:,2));
    theta_s = theta;
    J_history(i) = computeCost(X, y, theta);  % save the cost of every iteration
end
end
Note: theta_s holds the previous theta so that theta(1) and theta(2) are both updated from the old values (a simultaneous update). The two components are written out separately here for clarity; equivalently, the whole update can be vectorized as theta = theta - alpha/m * X' * (X*theta - y).
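The loop implements the batch gradient descent update rule, where x_0^{(i)} = 1 is the added ones column:

```latex
\theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m}
\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)\, x_j^{(i)}, \qquad j = 0, 1
```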
4. Debugging
1) MATLAB vectors are indexed from 1, not 0;
2) Check that matrix dimensions match before every operation.
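For the second point, size is the quickest way to locate a mismatch (a minimal sketch using the variables defined above):

```matlab
size(X)          % expect m x 2 after the ones column is added
size(theta)      % expect 2 x 1
size(X * theta)  % expect m x 1, the same as size(y)
```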