ex2 machine learning assignment walkthrough - 1 - Logistic Regression

ex2.m

1. As before

clear ; close all; clc
clc clears the command window, clear removes variables from the workspace, and close all closes all open figure windows.

2. Load the data

data = load('ex2data1.txt');
X = data(:, [1, 2]); y = data(:, 3);
Take a look at what ex2data1.txt actually contains:
[figure: the first rows of ex2data1.txt]
It is a three-column table: two exam scores and a 0/1 admission label.
(:, [1,2]) means: rows - take all rows; columns - take columns 1 and 2.
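A tiny sketch of what this indexing does (the matrix A here is made up purely for illustration):

A = [1 2 3; 4 5 6; 7 8 9];
A(:, [1, 2])   % all rows, columns 1 and 2 -> [1 2; 4 5; 7 8]
A(:, 3)        % all rows, column 3        -> [3; 6; 9]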

3. plotData()

The target plot is:
[figure: scatter plot of the training data, '+' for admitted, 'o' for not admitted]
Open plotData.m:
pos = find(y == 1); neg = find(y == 0);
The idea is to store in pos the indices (into the vector y) of all examples with y == 1, and likewise neg holds the indices where y == 0.
We never plot y itself: y only takes two values, 0 or 1, so we plot the points of X and indicate y by marking the points with y == 1 and y == 0 with different symbols; each point's coordinates are simply its two features in X.
That makes it easy: plot the two groups separately, marking the points with y == 1 using 'k+' and the rest with 'ko':
plot(X(pos, 1), X(pos, 2), 'k+', 'LineWidth', 2, 'MarkerSize', 7);
plot(X(neg, 1), X(neg, 2), 'ko', 'MarkerFaceColor', 'y', 'MarkerSize', 7);

X(pos, 1), X(pos, 2) give the horizontal and vertical coordinates: rows - the rows listed in pos; columns - column 1 for the x-coordinate and column 2 for the y-coordinate.
hold on
Then set the axis labels and so on:
xlabel('Exam 1 score')
ylabel('Exam 2 score')

% Specified in plot order
legend('Admitted', 'Not admitted')
hold off;
The full code:

function plotData(X, y)
%PLOTDATA Plots the data points X and y into a new figure 
%   PLOTDATA(x,y) plots the data points with + for the positive examples
%   and o for the negative examples. X is assumed to be a Mx2 matrix.

% Create New Figure
figure; hold on;

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the positive and negative examples on a
%               2D plot, using the option 'k+' for the positive
%               examples and 'ko' for the negative examples.
%
% Find Indices of Positive and Negative Examples
pos = find(y==1); 
neg = find(y == 0);
% Plot Examples
plot(X(pos, 1), X(pos, 2), 'k+','LineWidth', 2,'MarkerSize', 7);
plot(X(neg, 1), X(neg, 2), 'ko', 'MarkerFaceColor', 'y', 'MarkerSize', 7);


% =========================================================================



hold off;

end

4. costFunction()

Initialization:
[m, n] = size(X); : m is the number of rows of X and n is the number of columns, i.e. 2.

X = [ones(m, 1) X]; : prepend a column of ones to X; ones(m, 1) is an m-by-1 column of ones.

initial_theta = zeros(n + 1, 1); : initialize theta to all zeros.
Why n + 1 rows?
X is now an m-by-(n+1) matrix and y is an m-by-1 vector; X * theta = predictions (same shape as y):
[m, n+1] * [n+1, 1] = [m, 1]  -->  theta must be (n+1)-by-1.
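A quick shape check (a sketch; the variable names follow ex2.m):

[m, n] = size(X);                 % n = 2 feature columns
X = [ones(m, 1) X];               % X is now m x (n+1)
initial_theta = zeros(n + 1, 1);  % theta is (n+1) x 1
size(X * initial_theta)           % [m 1], the same shape as y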

Computing the cost:
function [J, grad] = costFunction(theta, X, y)

J

J is a scalar: the cost for this particular theta.
J(theta) = (1/m) * sum( -y .* log(h) - (1 - y) .* log(1 - h) )
In linear regression (ex1), h = X * theta gives an [m, 1] vector directly, while in classification we pass it through the sigmoid:
h = sigmoid(X * theta);
which means
J = -1 * y' * log(h) - (1 - y)' * log(1 - h);
J = J / m;
J is a scalar; y is [m, 1] and log(h) is [m, 1], so to collapse them into a scalar we take their inner product, y' * log(h).
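As a sketch, the same cost written with an explicit loop gives the identical number, which is a handy way to check the vectorized version (assumes m, X, y, theta and sigmoid are already defined):

h = sigmoid(X * theta);
J_loop = 0;
for i = 1:m
    J_loop = J_loop + ( -y(i) * log(h(i)) - (1 - y(i)) * log(1 - h(i)) );
end
J_loop = J_loop / m;   % equals (-y' * log(h) - (1 - y)' * log(1 - h)) / m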

grad

grad is an (n+1)-dimensional vector, the same shape as theta, and the partial derivative of J with respect to each theta_j is
dJ/dtheta_j = (1/m) * sum( (h - y) .* X(:, j) )
so theta has to be updated by a vector of its own shape.
h - y is an m-by-1 column vector (one entry per training example) and X is [m, n+1], so
(h - y)' * X : [m,1]' = [1,m], times [m, n+1] --> a [1, n+1] row vector,
which is why we transpose at the end.
Finally: grad = (1 / m * (h - y)' * X)';
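Likewise, the vectorized gradient can be checked against the element-wise form (a sketch, reusing h from above):

grad_loop = zeros(size(theta));
for j = 1:length(theta)
    grad_loop(j) = (1 / m) * sum( (h - y) .* X(:, j) );
end
% grad_loop matches grad = (1 / m * (h - y)' * X)'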

The code of costFunction.m:

function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly 
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta
%
h = sigmoid(X * theta);
J = -1 * y' *log(h) - (1 - y)' * log(1 - h);
J = J / m;

grad = (1 / m * (h - y)' * X)';

% =============================================================

end
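A quick call with the all-zeros theta (the expected cost of about 0.693, i.e. log(2), comes from the exercise handout):

[cost, grad] = costFunction(initial_theta, X, y);
fprintf('Cost at initial theta (zeros): %f\n', cost);   % about 0.693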

The sigmoid code:

function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z.

% You need to return the following variables correctly 
g = zeros(size(z));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the sigmoid of each value of z (z can be a matrix,
%               vector or scalar).
for i = 1:size(z,1)
	for j = 1:size(z,2)
		g(i,j) = 1/(1+exp(-1*z(i,j)));
	end
end
% =============================================================

end
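The double loop works, but exp is already element-wise in Octave/MATLAB, so an equivalent one-line version (an alternative, not what the file above does) is:

function g = sigmoid(z)
%SIGMOID Vectorized sigmoid of a scalar, vector or matrix z
g = 1 ./ (1 + exp(-z));
end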

5. Optimization using fminunc

Pass the costFunction we just wrote into fminunc.
Set the options: 'GradObj', 'on' tells fminunc that our cost function also returns the gradient, and 'MaxIter', 400 caps the number of iterations.
options = optimset('GradObj', 'on', 'MaxIter', 400);
[theta, cost] = fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);
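A sketch of inspecting what comes back (ex2.m prints something similar; the expected cost of roughly 0.203 comes from the exercise handout):

fprintf('Cost at theta found by fminunc: %f\n', cost);   % roughly 0.203 for this data set
fprintf('theta:\n');
fprintf(' %f\n', theta);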

6. plotDecisionBoundary(theta, X, y)

[figure: the training data with the fitted linear decision boundary]
plot_x = [min(X(:,2))-2, max(X(:,2))+2];
min(X(:,2)) is the minimum of column 2 over all rows; max(X(:,2)) works the same way, and the +/-2 just pads the line a little past the data.
plot_y = (-1./theta(3)).*(theta(2).*plot_x + theta(1));
Reference: https://www.cnblogs.com/tornadomeet/archive/2013/03/16/2963919.html

Set the logistic regression output equal to 0.5; then the exponent of e in the sigmoid must be 0, i.e.
theta(1)*1 + theta(2)*plot_x + theta(3)*plot_y = 0, and we solve this for plot_y.
The boundary has y = 1 on one side and y = 0 on the other, which in terms of the sigmoid means its argument is 0:
if the argument is less than 0 the prediction is 0, and if it is greater than 0 the prediction is 1.
theta(1)*1 + theta(2)*plot_x + theta(3)*plot_y
Explanation:
this is just a + b*x + c*y = 0,
where x is the second column of X and y (the value we solve for here) corresponds to the third column.
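A small numeric sketch (the theta values here are made up purely for illustration, not the optimized ones):

theta  = [-10; 0.1; 0.2];          % hypothetical a, b, c
plot_x = [30, 100];                % two x-values are enough to draw a line
plot_y = (-1 ./ theta(3)) .* (theta(2) .* plot_x + theta(1));   % -> [35, 0]
% check: theta(1) + theta(2)*30 + theta(3)*35 = -10 + 3 + 7 = 0, so the point lies on the boundary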

function plotDecisionBoundary(theta, X, y)
%PLOTDECISIONBOUNDARY Plots the data points X and y into a new figure with
%the decision boundary defined by theta
%   PLOTDECISIONBOUNDARY(theta, X,y) plots the data points with + for the 
%   positive examples and o for the negative examples. X is assumed to be 
%   a either 
%   1) Mx3 matrix, where the first column is an all-ones column for the 
%      intercept.
%   2) MxN, N>3 matrix, where the first column is all-ones

% Plot Data
plotData(X(:,2:3), y);
hold on

if size(X, 2) <= 3
    % Only need 2 points to define a line, so choose two endpoints
    plot_x = [min(X(:,2))-2,  max(X(:,2))+2];

    % Calculate the decision boundary line
    plot_y = (-1./theta(3)).*(theta(2).*plot_x + theta(1));

    % Plot, and adjust axes for better viewing
    plot(plot_x, plot_y)
    
    % Legend, specific for the exercise
    legend('Admitted', 'Not admitted', 'Decision Boundary')
    axis([30, 100, 30, 100])

This is the part used in this exercise.
The part below is not used here (the two branches together make up plotDecisionBoundary).

else
    % Here is the grid range
    u = linspace(-1, 1.5, 50);
    v = linspace(-1, 1.5, 50);

    z = zeros(length(u), length(v));
    % Evaluate z = theta*x over the grid
    for i = 1:length(u)
        for j = 1:length(v)
            z(i,j) = mapFeature(u(i), v(j))*theta;
        end
    end
    z = z'; % important to transpose z before calling contour

    % Plot z = 0
    % Notice you need to specify the range [0, 0]
    contour(u, v, z, [0, 0], 'LineWidth', 2)
end
hold off

end
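For reference, ex2.m calls this with the X that already includes the column of ones (as the header comment requires), roughly like:

plotDecisionBoundary(theta, X, y);   % X here is m x 3: [ones(m,1), exam 1, exam 2]
xlabel('Exam 1 score')               % relabel the axes on the combined plot
ylabel('Exam 2 score')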