This programming assignment has two parts: recognizing the handwritten digits 0-9, first with one-vs-all logistic regression and then with a neural network.
1 Multi-class Classification
1.1 Dataset
The assignment provides 5000 training examples. Each example is a 20×20-pixel grayscale image, and each pixel is stored as a floating-point number; the 20×20 grid is unrolled into a 400-dimensional vector. Each example forms one row of the matrix X, so X is a 5000×400 matrix.
X = \begin{bmatrix} (x^{(1)})^{T}\\ (x^{(2)})^{T}\\ (x^{(3)})^{T}\\ \vdots\\ (x^{(m)})^{T} \end{bmatrix}
Because Octave/MATLAB indexing starts at 1 (there is no index 0), the digit 0 is labeled as class 10.
1.2 Visualizing the data
Here 100 examples are selected at random and visualized.
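As a minimal sketch of how this can be done (assuming the displayData.m helper that ships with the assignment), one can pick 100 random rows of X and display them:

% Randomly select 100 examples and display them
% (displayData.m is the visualization helper provided with the assignment)
rand_indices = randperm(m);
sel = X(rand_indices(1:100), :);
displayData(sel);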
1.3 Vectorizing Logistic Regression
Implement the cost function and gradient of logistic regression using vectorized operations.
Note: both the cost function and the gradient need to be regularized here.
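For reference, the regularized cost function and gradient to be computed are as follows, where $h_\theta(x)=g(\theta^{T}x)$ with $g$ the sigmoid function, and the bias parameter $\theta_0$ is not regularized:

J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\left[-y^{(i)}\log\left(h_\theta(x^{(i)})\right)-(1-y^{(i)})\log\left(1-h_\theta(x^{(i)})\right)\right]+\frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^{2}

\frac{\partial J(\theta)}{\partial \theta_0} = \frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)})-y^{(i)}\right)x_0^{(i)}, \qquad \frac{\partial J(\theta)}{\partial \theta_j} = \frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)})-y^{(i)}\right)x_j^{(i)}+\frac{\lambda}{m}\theta_j \quad (j\ge 1)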
This part has to be completed by ourselves; my code is attached below:
lrCostFunction.m
function [J, grad] = lrCostFunction(theta, X, y, lambda)
%LRCOSTFUNCTION Compute cost and gradient for logistic regression with
%regularization
% J = LRCOSTFUNCTION(theta, X, y, lambda) computes the cost of using
% theta as the parameter for regularized logistic regression and the
% gradient of the cost w.r.t. to the parameters.
% Initialize some useful values
m = length(y); % number of training examples
% You need to return the following variables correctly
J = 0;
grad = zeros(size(theta));
% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
% You should set J to the cost.
% Compute the partial derivatives and set grad to the partial
% derivatives of the cost w.r.t. each parameter in theta
%
% Hint: The computation of the cost function and gradients can be
% efficiently vectorized. For example, consider the computation
%
% sigmoid(X * theta)
%
% Each row of the resulting matrix will contain the value of the
% prediction for that example. You can make use of this to vectorize
% the cost function and gradient computations.
%
% Hint: When computing the gradient of the regularized cost function,
% there're many possible vectorized solutions, but one solution
% looks like:
% grad = (unregularized gradient for logistic regression)
% temp = theta;
% temp(1) = 0; % because we don't add anything for j = 0
% grad = grad + YOUR_CODE_HERE (using the temp variable)
%
% Vectorized hypothesis h = g(X*theta)
h = sigmoid(X * theta);
% Regularized cost; the bias term theta(1) is excluded from the penalty
J = (-1/m) * (y' * log(h) + (1 - y)' * log(1 - h)) ...
    + (lambda / (2*m)) * (theta' * theta - theta(1)^2);
% Regularized gradient, then remove the penalty from the bias term
grad = (1/m) * (X' * (h - y)) + (lambda/m) * theta;
grad(1) = grad(1) - (lambda/m) * theta(1);
% =============================================================
grad = grad(:);
end
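A quick way to sanity-check the function is to call it on a tiny hand-made problem; the numbers below are purely illustrative and not taken from the assignment:

theta_t = [-2; -1; 1];                        % 3 parameters (bias + 2 features)
X_t = [ones(5, 1), reshape(1:10, 5, 2) / 10]; % 5 examples with a bias column
y_t = [1; 0; 1; 0; 1];                        % binary labels
lambda_t = 3;
[J_t, grad_t] = lrCostFunction(theta_t, X_t, y_t, lambda_t);
fprintf('Cost: %f\n', J_t);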
1.4 One-vs-all Classification
This part implements one-vs-all classification. The multi-class problem is reduced to K ordinary binary logistic regression problems: for the k-th classifier, examples belonging to class k are labeled 1 and all other examples are labeled 0. At prediction time each classifier outputs the probability that an example belongs to its class; these probabilities are collected in a vector, and the class with the largest probability is taken as the prediction.
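The assignment trains one regularized classifier per class in oneVsAll.m; a minimal sketch of that training loop, assuming the fmincg optimizer supplied with the assignment, might look like this:

function all_theta = oneVsAll(X, y, num_labels, lambda)
% Trains num_labels regularized logistic regression classifiers;
% row c of all_theta holds the learned parameters for class c.
m = size(X, 1);
n = size(X, 2);
all_theta = zeros(num_labels, n + 1);
X = [ones(m, 1) X];                          % add the bias column
options = optimset('GradObj', 'on', 'MaxIter', 50);
for c = 1:num_labels
    initial_theta = zeros(n + 1, 1);
    % (y == c) turns the labels into a 0/1 vector for the c-th classifier
    all_theta(c, :) = fmincg(@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
                             initial_theta, options)';
end
end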
The prediction function also has to be completed by us; my code is attached:
predictOneVsAll.m
function p = predictOneVsAll(all_theta, X)
%PREDICT Predict the label for a trained one-vs-all classifier. The labels
%are in the range 1..K, where K = size(all_theta, 1).
% p = PREDICTONEVSALL(all_theta, X) will return a vector of predictions
% for each example in the matrix X. Note that X contains the examples in
% rows. all_theta is a matrix where the i-th row is a trained logistic
% regression theta vector for the i-th class. You should set p to a vector
% of values from 1..K (e.g., p = [1; 3; 1; 2] predicts classes 1, 3, 1, 2
% for 4 examples)
m = size(X, 1);
num_labels = size(all_theta, 1);
% You need to return the following variables correctly
p = zeros(size(X, 1), 1);
% Add ones to the X data matrix
X = [ones(m, 1) X];
% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
% your learned logistic regression parameters (one-vs-all).
% You should set p to a vector of predictions (from 1 to
% num_labels).
%
% Hint: This code can be done all vectorized using the max function.
% In particular, the max function can also return the index of the
% max element, for more information see 'help max'. If your examples
% are in rows, then, you can use max(A, [], 2) to obtain the max
% for each row.
%
% Class scores for every example; sigmoid is monotonic, so applying it
% would not change which class attains the maximum
scores = X * all_theta';
[~, p] = max(scores, [], 2);   % p is the index (class) of the largest score
% =========================================================================
end
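As a usage sketch (assuming all_theta has been trained as above and y holds the true labels 1..10), the training-set accuracy can be checked with:

pred = predictOneVsAll(all_theta, X);
fprintf('Training set accuracy: %f\n', mean(double(pred == y)) * 100);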
2 Neural Networks
This part accomplishes the same task as the first part, but with a neural network.
2.1 Model representation
Since there are 400 features, the input layer has 400 + 1 units (the extra one is the bias unit $x_0$). The hidden layer is set to 25 units, and because we need to recognize the ten digits 0-9, the output layer has 10 units, just as in the multi-class logistic regression above. Accordingly, the provided weight matrices have sizes Theta1: 25×401 and Theta2: 10×26.
The resulting network is shown in the figure in the assignment handout.
2.2 Feedforward Propagation and Prediction
In this part we implement the forward pass of the neural network ourselves. All the weights (Theta1 and Theta2) are already provided, so there is no training to do; we only need to write the vectorized feedforward computation.
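For reference, the feedforward computation implemented in predict.m is the following, where a bias unit is prepended to $a^{(1)}$ and $a^{(2)}$ and $g$ is the sigmoid function:

a^{(1)} = x, \qquad a^{(2)} = g\left(\Theta^{(1)} a^{(1)}\right), \qquad h_\Theta(x) = a^{(3)} = g\left(\Theta^{(2)} a^{(2)}\right)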
My code is attached here:
predict.m
function p = predict(Theta1, Theta2, X)
%PREDICT Predict the label of an input given a trained neural network
% p = PREDICT(Theta1, Theta2, X) outputs the predicted label of X given the
% trained weights of a neural network (Theta1, Theta2)
% Useful values
m = size(X, 1);
num_labels = size(Theta2, 1);
% You need to return the following variables correctly
p = zeros(size(X, 1), 1);
% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
% your learned neural network. You should set p to a
% vector containing labels between 1 to num_labels.
%
% Hint: The max function might come in useful. In particular, the max
% function can also return the index of the max element, for more
% information see 'help max'. If your examples are in rows, then, you
% can use max(A, [], 2) to obtain the max for each row.
%
% Add the bias unit to the input layer
X = [ones(m, 1), X];
% Hidden-layer activations, then add the hidden layer's bias unit
a_2 = sigmoid(X * Theta1');
a_2 = [ones(m, 1), a_2];
% Output-layer activations: one value per class
a_3 = sigmoid(a_2 * Theta2');
% The predicted label is the index of the largest output
[~, p] = max(a_3, [], 2);
% =========================================================================
end
That concludes this programming assignment.