> Andrew Ng Machine Learning course link
> Course summary and notes link
The original code and data for Exercise 2 can be downloaded from the programming assignment under Lesson 60, Chapter 8 at the course link above
Machine Learning Online Class - Exercise 2: Logistic Regression
Covers the logistic regression cost function, gradient, automated optimization, and prediction, as well as the regularized cost function and gradient
Environment: Matlab R2018b/Octave
Unregularized Logistic Regression
Part 1: Plotting
plotData.m
Binary classification: the two classes of data are shown with different markers on the plot
function plotData(X, y)
%PLOTDATA Plots the data points X and y into a new figure
% PLOTDATA(x,y) plots the data points with + for the positive examples
% and o for the negative examples. X is assumed to be a Mx2 matrix.
% Create New Figure
figure; hold on;
% ====================== YOUR CODE HERE ======================
% Instructions: Plot the positive and negative examples on a
% 2D plot, using the option 'k+' for the positive
% examples and 'ko' for the negative examples.
%
positive = find(y == 1);
negative = find(y == 0);
plot(X(positive, 1), X(positive, 2), 'k+');                          % positive examples: black +
plot(X(negative, 1), X(negative, 2), 'ko', 'MarkerFaceColor', 'b');  % negative examples: blue-filled circles
% =========================================================================
hold off;
end
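Below is a minimal driver sketch for plotData. The file name ex2data1.txt and its column layout (two exam scores plus a 0/1 admission label) come from the assignment data; adjust them if your copy differs.
% Minimal usage sketch -- assumes ex2data1.txt with columns
% [exam 1 score, exam 2 score, 0/1 admission label]
data = load('ex2data1.txt');
X = data(:, [1, 2]);   % feature matrix (Mx2)
y = data(:, 3);        % labels
plotData(X, y);
% Legend order matches the plotting order inside plotData
xlabel('Exam 1 score');
ylabel('Exam 2 score');
legend('Admitted', 'Not admitted');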
Result:
Part 2: Compute Cost and Gradient
sigmoid.m
function g = sigmoid(z)
%SIGMOID Compute sigmoid function
% g = SIGMOID(z) computes the sigmoid of z.
% You need to return the following variables correctly
g = zeros(size(z));
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the sigmoid of each value of z (z can be a matrix,
% vector or scalar).
g = 1 ./ (exp(-z)+1);
% =============================================================
end
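A few quick sanity checks, using only standard properties of the sigmoid: g(0) = 0.5, g(z) approaches 1 for large positive z and 0 for large negative z, and the element-wise ./ lets the same code handle scalars, vectors, and matrices.
sigmoid(0)                 % 0.5000
sigmoid(10)                % close to 1
sigmoid(-10)               % close to 0
sigmoid([-1 0 1; 2 3 4])   % applied element-wise to a 2x3 matrix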
costFunction.m
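For reference, the unregularized cost and gradient that this function should return, with $h_\theta(x) = g(\theta^{T}x)$, $g$ the sigmoid above, and $m$ training examples:
$$J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\left[-y^{(i)}\log\big(h_\theta(x^{(i)})\big) - \big(1 - y^{(i)}\big)\log\big(1 - h_\theta(x^{(i)})\big)\right]$$
$$\frac{\partial J(\theta)}{\partial \theta_j} = \frac{1}{m}\sum_{i=1}^{m}\big(h_\theta(x^{(i)}) - y^{(i)}\big)\,x_j^{(i)}$$
Note that grad is only the gradient itself; costFunction performs no descent step, since the built-in optimizer used later in the exercise (fminunc) chooses the step size.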
function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
% J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
% parameter for logistic regression and the gradient of the cost
% w.r.t. to the parameters.
% Initialize some useful values
m = length(y); % number of training examples
% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
% You should set J to the cost.
% Compute the partial derivatives and set grad to the partial
% derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta
%
% Split the training examples by label: positive examples contribute
% -log(h) to the cost, negative examples contribute -log(1 - h).
pos = y == 1;
neg = y == 0;
h_pos = sigmoid(X(pos, :) * theta);   % hypothesis values for positive examples
J_pos = sum(-log(h_pos));             % their total contribution to the cost
h_neg = sigmoid(X(neg, :) * theta);   % hypothesis values for negative examples