Beyond Simple Regression: Multivariate Linear Regression (Octave & MATLAB versions)

1. Main script
data = load('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);
[X, mu, sigma] = featureNormalize(X);   % scale features before gradient descent
X = [ones(m, 1) X];                     % add intercept column
alpha = [0.03 0.1 0.3];
num_iters = 50;
fprintf('Selecting learning rates alpha.\n')
pause
% Run gradient descent once per candidate learning rate, restarting from
% theta = zeros each time so the runs are comparable
[theta1, J1] = gradientDescentMulti(X, y, zeros(3, 1), alpha(1), num_iters);
[theta2, J2] = gradientDescentMulti(X, y, zeros(3, 1), alpha(2), num_iters);
[theta3, J3] = gradientDescentMulti(X, y, zeros(3, 1), alpha(3), num_iters);
plot(1:numel(J1), J1, 'g', 'linewidth', 2);
xlabel('Number of iterations (50)');
ylabel('Cost J(\theta)');
figure
plot(1:numel(J2), J2, ':r', 'linewidth', 2);
xlabel('Number of iterations (50)');
ylabel('Cost J(\theta)');
figure
plot(1:numel(J3), J3, '--k', 'linewidth', 2);
xlabel('Number of iterations (50)');
ylabel('Cost J(\theta)');
fprintf('Normal Equations method.\n');
pause
[NEtheta] = normalEqn(X, y);            % closed-form solution for comparison
[Figures: cost J(\theta) versus iteration count for alpha = 0.03, 0.1, and 0.3]
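The point of running gradient descent three times is that, within the stable range, a larger learning rate drives the cost down faster for the same iteration budget. A minimal NumPy sketch of that comparison, using hypothetical noise-free data in place of ex1data2.txt (variable names are my own):

```python
import numpy as np

# Batch gradient descent returning the cost history; same update rule as
# the Octave gradientDescentMulti below.
def gd_cost_history(X, y, alpha, iters):
    theta = np.zeros(X.shape[1])
    m = y.size
    J = np.zeros(iters)
    for t in range(iters):
        theta = theta - (alpha / m) * (X.T @ (X @ theta - y))
        J[t] = np.sum((X @ theta - y) ** 2) / (2 * m)
    return J

# Hypothetical normalized data: y = 4 + 1.5*x1 - 2*x2, no noise
rng = np.random.default_rng(1)
F = rng.normal(size=(47, 2))
y = 4 + F @ np.array([1.5, -2.0])
X = np.hstack([np.ones((47, 1)), F])

finals = {a: gd_cost_history(X, y, a, 50)[-1] for a in (0.03, 0.1, 0.3)}
for a, J50 in finals.items():
    print(f"alpha={a}: cost after 50 iters = {J50:.6f}")
```

On this well-conditioned data the cost after 50 iterations shrinks as alpha grows; with an alpha too large for the data, the cost would instead diverge.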
2. Function definitions
(1) featureNormalize
function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Normalizes the features in X
%   FEATURENORMALIZE(X) returns a normalized version of X where the mean
%   of each feature is 0 and the standard deviation is 1.
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));
for i = 1:size(X, 2)
    mu(i) = mean(X(:, i));
    sigma(i) = std(X(:, i));
end
X_norm = (X - mu) ./ sigma;   % broadcasting applies mu and sigma column-wise

end
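The same z-score normalization in NumPy, as a sketch mirroring the Octave function above (note `ddof=1` so the standard deviation matches Octave's default n-1 normalization):

```python
import numpy as np

def feature_normalize(X):
    """Z-score each column: subtract its mean, divide by its std.
    Mirrors the Octave featureNormalize above."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0, ddof=1)   # Octave's std divides by n-1
    X_norm = (X - mu) / sigma
    return X_norm, mu, sigma

# toy data (hypothetical): two features on very different scales
X = np.array([[2104.0, 3], [1600.0, 3], [2400.0, 4], [1416.0, 2]])
X_norm, mu, sigma = feature_normalize(X)
print(X_norm.mean(axis=0))           # each column's mean is ~0
print(X_norm.std(axis=0, ddof=1))    # each column's std is ~1
```

Keeping `mu` and `sigma` matters: any new example must be normalized with the *training* statistics before prediction.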
(2) gradientDescentMulti
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters
    H = X * theta;                                 % hypothesis
    theta = theta - (alpha / m) * (X' * (H - y));  % simultaneous batch update

    % Save the cost J in every iteration
    J_history(iter) = computeCostMulti(X, y, theta);
end

end
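A NumPy sketch of the same batch update, run on a small hypothetical dataset generated from known coefficients so convergence is easy to check:

```python
import numpy as np

def compute_cost(X, y, theta):
    m = y.size
    h = X @ theta
    return np.sum((h - y) ** 2) / (2 * m)

def gradient_descent_multi(X, y, theta, alpha, num_iters):
    """Batch gradient descent; mirrors the Octave gradientDescentMulti."""
    m = y.size
    J_history = np.zeros(num_iters)
    for it in range(num_iters):
        h = X @ theta                                   # hypothesis h = X*theta
        theta = theta - (alpha / m) * (X.T @ (h - y))   # update all thetas at once
        J_history[it] = compute_cost(X, y, theta)
    return theta, J_history

# Hypothetical data: y = 1 + 2*x1 + 3*x2, no noise
rng = np.random.default_rng(0)
Xraw = rng.normal(size=(50, 2))
y = 1 + Xraw @ np.array([2.0, 3.0])
X = np.hstack([np.ones((50, 1)), Xraw])
theta, J = gradient_descent_multi(X, y, np.zeros(3), alpha=0.1, num_iters=500)
print(np.round(theta, 3))   # approaches [1, 2, 3]
print(J[-1] < J[0])         # cost decreased
```

The vectorized update `X' * (H - y)` computes all partial derivatives in one matrix product, which is why no inner loop over features is needed.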
(3) computeCostMulti
function [J] = computeCostMulti(X, y, theta)
%COMPUTECOSTMULTI Compute cost for linear regression with multiple variables
m = length(y); % number of training examples
H = X * theta;                        % predictions
J = 1 / (2 * m) * sum((H - y) .^ 2);  % halved mean squared error

end
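The cost J(theta) = (1/(2m)) * sum((X*theta - y)^2) is easy to verify by hand on tiny inputs; a NumPy sketch with a worked value:

```python
import numpy as np

def compute_cost_multi(X, y, theta):
    """J(theta) = (1/(2m)) * sum((X @ theta - y)^2), as in the Octave code."""
    m = y.size
    h = X @ theta
    return np.sum((h - y) ** 2) / (2 * m)

# With theta = 0 every prediction is 0, so the residuals are just -y:
# (49 + 36 + 25) / (2*3) = 110/6
X = np.array([[1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])
y = np.array([7.0, 6.0, 5.0])
print(compute_cost_multi(X, y, np.zeros(2)))   # 110/6 ≈ 18.333
```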
(4) normalEqn (closed-form alternative)
function [theta] = normalEqn(X, y)
%NORMALEQN Computes the closed-form solution to linear regression

theta = pinv(X' * X) * X' * y;   % pinv stays well-defined even if X'*X is singular
end
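The normal equation theta = pinv(X'X) X'y solves the problem in one step, with no learning rate and no iterations. A NumPy sketch on a tiny hypothetical dataset where the exact answer is known:

```python
import numpy as np

def normal_eqn(X, y):
    # theta = pinv(X'X) X'y; pinv handles a rank-deficient X'X gracefully
    return np.linalg.pinv(X.T @ X) @ X.T @ y

# Exactly y = 1 + 2x, so the fit should recover [1, 2]
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])
theta = normal_eqn(X, y)
print(theta)   # ≈ [1, 2]
```

The trade-off versus gradient descent: no alpha to tune, but inverting (or pseudo-inverting) X'X costs roughly O(n^3) in the number of features, so gradient descent wins when n is very large.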
