>Andrew Ng Machine Learning course link
>Course summary and notes link

The original code and data for Exercise 1 can be downloaded from the programming assignment in Lesson 45, Chapter 6 at the course link above.
Machine Learning Online Class - Exercise 1: Linear Regression
It covers a warm-up exercise, cost-function computation for one and multiple variables, gradient-descent parameter updates, feature normalization, the normal equation, and more.
Environment: Matlab R2018b / Octave
Single variable
Part 1: Basic Function
Print a 5×5 identity matrix.
warmUpExercise.m
function A = warmUpExercise()
%WARMUPEXERCISE Example function in octave
% A = WARMUPEXERCISE() is an example function that returns the 5x5 identity matrix
A = [];
% ============= YOUR CODE HERE ==============
% Instructions: Return the 5x5 identity matrix
% In octave, we return values by defining which variables
% represent the return values (at the top of the file)
% and then set them accordingly.
A = eye(5);
% ===========================================
end
Result:
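For comparison outside of Octave, the same warm-up is a one-liner in NumPy (not part of the assignment, just an equivalent sketch):

```python
import numpy as np

# 5x5 identity matrix, equivalent to Octave's eye(5)
A = np.eye(5)
print(A)
```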
Part 2: Plotting
Plot the dataset ex1data1.txt.
plotData.m
function plotData(x, y)
%PLOTDATA Plots the data points x and y into a new figure
% PLOTDATA(x,y) plots the data points and gives the figure axes labels of
% population and profit.
figure; % open a new figure window
% ====================== YOUR CODE HERE ======================
% Instructions: Plot the training data into a figure using the
% "figure" and "plot" commands. Set the axes labels using
% the "xlabel" and "ylabel" commands. Assume the
% population and profit data have been passed in
% as the x and y arguments of this function.
%
% Hint: You can use the 'rx' option with plot to have the markers
% appear as red crosses. Furthermore, you can make the
% markers larger by using plot(..., 'rx', 'MarkerSize', 10);
plot(x, y, 'rx', 'MarkerSize', 10);
xlabel('population');
ylabel('profit');
% ============================================================
end
Result:
Part 3: Cost and Gradient descent
Cost function and gradient descent.
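For reference, these are the standard squared-error cost and batch gradient-descent update from the course, which the two files below implement:

```latex
J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2,
\qquad h_\theta(x) = \theta^{T} x
```

```latex
\theta_j := \theta_j - \alpha \, \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}
\qquad \text{(update all } \theta_j \text{ simultaneously)}
```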
computeCost.m
function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
% J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
% parameter for linear regression to fit the data points in X and y
% Initialize some useful values
m = length(y); % number of training examples
% You need to return the following variables correctly
J = 0;
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
% You should set J to the cost.
predictions = X * theta;
sqrErrors = (predictions - y).^2;
J = 1 / (2 * m) * sum(sqrErrors);
% =========================================================================
end
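A quick NumPy sketch of the same vectorized cost, using a tiny made-up dataset rather than the course data:

```python
import numpy as np

def compute_cost(X, y, theta):
    """Squared-error cost: J(theta) = 1/(2m) * sum((X @ theta - y)^2)."""
    m = len(y)
    errors = X @ theta - y
    return (errors ** 2).sum() / (2 * m)

# Hypothetical data: a bias column of ones plus one feature, with y = 1 + x
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 3.0, 4.0])

print(compute_cost(X, y, np.zeros(2)))           # cost at theta = [0, 0]
print(compute_cost(X, y, np.array([1.0, 1.0])))  # perfect fit, so cost is 0
```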
gradientDescent.m
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
% theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
% taking num_iters gradient steps with learning rate alpha
% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);
for iter = 1:num_iters
% ====================== YOUR CODE HERE ======================
% Instructions: Perform a single gradient step on the parameter vector
% theta.
%
% Hint: While debugging, it can be useful to print out the values
% of the cost function (computeCost) and gradient here.
%
err = X * theta - y;  % compute the error once so both updates use the same theta
theta(1) = theta(1) - alpha / m * sum(err);
theta(2) = theta(2) - alpha / m * sum(err .* X(:,2));
% (equivalently, fully vectorized: theta = theta - alpha / m * X' * err;)
% ============================================================
% Save the cost J in every iteration
J_history(iter) = computeCost(X, y, theta);
end
end
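The same loop as a NumPy sketch, vectorized over all parameters at once and again using hypothetical noiseless data (y = 1 + x), so theta should approach [1, 1]:

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent: theta := theta - alpha/m * X'(X theta - y)."""
    m = len(y)
    J_history = np.zeros(num_iters)
    for it in range(num_iters):
        errors = X @ theta - y                    # current theta for every parameter
        theta = theta - (alpha / m) * (X.T @ errors)
        J_history[it] = (errors ** 2).sum() / (2 * m)
    return theta, J_history

X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([2.0, 3.0, 4.0])                     # exactly y = 1 + x
theta, J_history = gradient_descent(X, y, np.zeros(2),
                                    alpha=0.1, num_iters=2000)
print(theta)  # close to [1, 1]
```

Note the difference from the Octave version above: the whole error vector is computed once per iteration, so every parameter is updated with the same theta (a simultaneous update), which is what the algorithm requires.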