Becoming a Programmer, Part 1: Machine Learning, Linear Regression (written purely to push myself to keep studying)

I mainly worked through the Linear Regression portion of Andrew Ng's open course Machine Learning. The exercise page is here:

http://openclassroom.stanford.edu/MainFolder/DocumentPage.php?course=MachineLearning&doc=exercises/ex2/ex2.html
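
For reference, the first half of the script below runs batch gradient descent. With X denoting the m-by-2 design matrix (a leading column of ones plus the ages), y the heights, and alpha the learning rate, each iteration applies the vectorized update

    \theta := \theta - \frac{\alpha}{m} X^{\top} (X\theta - y)

The second half then evaluates the cost function being minimized,

    J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( \theta^{\top} x^{(i)} - y^{(i)} \right)^{2}

over a grid of (\theta_0, \theta_1) pairs so the cost surface can be plotted.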


The code I wrote is as follows:


%%%%%%%%%%%%%%%%%%%%%%%%Linear regression

clear all; close all; clc

x = load('ex2x.dat');
y = load('ex2y.dat');

figure % open a new figure window
plot(x, y, 'o');
ylabel('Height in meters')
xlabel('Age in years')
m = length(y); % store the number of training examples
x = [ones(m, 1), x]; % add a column of ones to x (the intercept term)

theta = zeros(2, 1); % initialize the parameters to zero
alpha = 0.07;        % learning rate

for iter = 1:1500 % number of gradient descent iterations
    h_theta = x * theta; % hypothesis: predictions for all training examples
    theta = theta - alpha/m * x' * (h_theta - y); % batch gradient descent update
end
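% Not in the exercise hand-out; I print theta here just to see the result
% of gradient descent in the console.
fprintf('theta found by gradient descent: %f %f\n', theta(1), theta(2));
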
hold on % Plot new data without clearing old plot
plot(x(:,2), x*theta, '-') % remember that x is now a matrix with 2 columns
                           % and the second column contains the ages
legend('Training data', 'Linear regression')
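
% A follow-up question from the exercise page: predict the height for two
% boys, aged 3.5 and 7. The leading 1 supplies the intercept term.
predict1 = [1, 3.5] * theta;
predict2 = [1, 7] * theta;
fprintf('Predicted height at age 3.5: %f m\n', predict1);
fprintf('Predicted height at age 7:   %f m\n', predict2);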

%%%%%%%%%%%%%%%%%%%%%%%%  Understanding J(theta)
J_vals = zeros(100, 100);   % initialize J_vals to a 100x100 matrix of zeros
theta0_vals = linspace(-3, 3, 100);
theta1_vals = linspace(-1, 1, 100);
for i = 1:length(theta0_vals)
    for j = 1:length(theta1_vals)
        t = [theta0_vals(i); theta1_vals(j)];
        J_vals(i,j) = 1/(2*m) * sum((x*t - y).^2); % cost J(theta) at this grid point
    end
end
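
% A sanity check of my own: the grid minimum of J_vals should land near the
% theta found by gradient descent above. (Do this before the transpose
% below: here rows index theta0 and columns index theta1.)
[~, idx] = min(J_vals(:));
[i_min, j_min] = ind2sub(size(J_vals), idx);
fprintf('Grid minimum of J near theta0 = %f, theta1 = %f\n', ...
        theta0_vals(i_min), theta1_vals(j_min));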

% Plot the surface plot
% Because of the way meshgrids work in the surf command, we need to
% transpose J_vals before calling surf, or else the axes will be flipped
J_vals = J_vals';
figure;
surf(theta0_vals, theta1_vals, J_vals)
xlabel('\theta_0'); ylabel('\theta_1')
% Plot the cost function with 15 contours spaced logarithmically
% between 0.01 and 100
figure;
contour(theta0_vals, theta1_vals, J_vals, logspace(-2, 2, 15))
xlabel('\theta_0'); ylabel('\theta_1')
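
% My own addition for a visual check: mark the gradient descent solution on
% the contour plot; it should sit at the bottom of the bowl.
hold on;
plot(theta(1), theta(2), 'rx', 'MarkerSize', 10, 'LineWidth', 2);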
