Assignment 2: Answers and Explanations
1 Logistic Regression
1.1 Visualizing the data (plotData.m)
% Find Indices of Positive and Negative Examples
pos = find(y == 1);
neg = find(y == 0);
% Plot Examples
plot(X(pos, 1), X(pos, 2), 'k+','LineWidth', 2, 'MarkerSize', 7);
plot(X(neg, 1), X(neg, 2), 'ko', 'MarkerFaceColor', 'y', 'MarkerSize', 7);
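Note that both plot calls land on the same axes because the plotData.m template already wraps this code in figure and hold on. Typical usage from the driver script might look like the following sketch (the axis labels are assumed from the standard exam-scores dataset, not shown in this write-up):
plotData(X, y);
% Labels and legend (assumed; the provided driver script adds these)
xlabel('Exam 1 score');
ylabel('Exam 2 score');
legend('Admitted', 'Not admitted');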
Result:
1.2 Implementation
1.2.1 sigmoid function (sigmoid.m)
The formulas are:
g(z) = \frac{1}{1+e^{-z}}

h_\theta(x) = \frac{1}{1+e^{-\theta^T x}}
Answer:
g = 1./(1+exp(-z));
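Thanks to the element-wise ./ and the fact that exp operates element-wise, this one line handles scalars, vectors, and matrices alike. A quick illustrative check (the values follow directly from the formula, they are not quoted handout output):
sigmoid(0)          % = 0.5 exactly
sigmoid([-5 0 5])   % approx [0.0067 0.5000 0.9933]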
1.2.2 Cost function and gradient (costFunction.m)
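The formulas implemented here are the standard (unregularized) logistic regression cost and gradient:

J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log h_\theta(x^{(i)}) + (1-y^{(i)})\log\left(1-h_\theta(x^{(i)})\right)\right]

\frac{\partial J(\theta)}{\partial \theta_j} = \frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_j^{(i)}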
J = -1/m * (y'*log(sigmoid(X*theta))+(1-y)'*log(1-sigmoid(X*theta)));
grad = 1/m * X' * (sigmoid(X*theta)-y);
For me, the main difficulty was figuring out which matrices to transpose in the products; once that is clear, the rest is just substituting into the formulas.
Result:
1.2.3 Learning parameters using fminunc
The code for this step is already provided; a sketch is below.
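For reference, the provided call should look roughly like this (a sketch inferred from the analogous regularized call in section 2.3.1; costFunction must return both the cost and the gradient):
% Set options: supply our own gradient, cap iterations at 400
options = optimset('GradObj', 'on', 'MaxIter', 400);
% Minimize the cost starting from initial_theta
[theta, cost] = fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);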
Result:
1.2.4 Evaluating logistic regression (predict.m)
p = round(sigmoid(X*theta));
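Here round thresholds the predicted probability at 0.5, which is exactly the decision rule "predict 1 when h_θ(x) ≥ 0.5". An equivalent, perhaps more explicit variant (my rewrite, not the handout's):
p = sigmoid(X*theta) >= 0.5;   % logical thresholding at 0.5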
Result:
2 Regularized logistic regression
2.1 Visualizing the data (plotData.m)
Visualization:
2.2 Feature mapping (mapFeature.m)
The purpose is to generate additional features; the code is already provided and needs no modification. A sketch of what it does is below.
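For context, mapFeature expands the two inputs into all monomials of x1 and x2 up to degree 6, giving 28 features including the constant term. A minimal sketch of that behavior (assuming the standard handout; the real file may differ in details):
function out = mapFeature(X1, X2)
% Map two features to all polynomial terms up to degree 6.
% The first column is all ones (intercept term).
degree = 6;
out = ones(size(X1(:,1)));
for i = 1:degree
    for j = 0:i
        out(:, end+1) = (X1.^(i-j)) .* (X2.^j);
    end
end
end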
2.3 Cost function and gradient (costFunctionReg.m)
Formula (the regularized cost and gradient):

J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log h_\theta(x^{(i)}) + (1-y^{(i)})\log\left(1-h_\theta(x^{(i)})\right)\right] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2

\frac{\partial J(\theta)}{\partial \theta_0} = \frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_0^{(i)}

\frac{\partial J(\theta)}{\partial \theta_j} = \frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_j^{(i)} + \frac{\lambda}{m}\theta_j \quad (j \ge 1)

Note the index on θ: the bias term θ_0 is excluded from the regularization.
% Unregularized cost plus the penalty on theta(2:end); theta(1), i.e. theta_0, is excluded
J = -1/m * (y'*log(sigmoid(X*theta)) + (1-y)'*log(1-sigmoid(X*theta))) + lambda/(2*m)*sum(theta(2:end).^2);
grad = zeros(size(theta));   % preallocated (the template also does this)
grad(1) = 1/m * (X(:, 1)' * (sigmoid(X*theta) - y));
grad(2:end) = 1/m * (X(:, 2:end)' * (sigmoid(X*theta) - y)) + lambda/m * theta(2:end);
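As a quick sanity check (a mathematical fact, not quoted handout output): with theta all zeros, sigmoid(X*theta) is 0.5 for every example and the penalty term vanishes, so the cost must equal log(2) ≈ 0.6931 regardless of the data:
initial_theta = zeros(size(X, 2), 1);
[J, grad] = costFunctionReg(initial_theta, X, y, 1);
% J should be approximately 0.6931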
Result:
2.3.1 Learning parameters using fminunc
The code is provided. Note that the 'GradObj', 'on' option tells fminunc that the objective function returns the gradient as its second output, which is why costFunctionReg computes both J and grad.
% Set Options
options = optimset('GradObj', 'on', 'MaxIter', 400);
% Optimize
[theta, J, exit_flag] = fminunc(@(t)(costFunctionReg(t, X, y, lambda)), initial_theta, options);
2.4 Plotting the decision boundary (plotDecisionBoundary.m)
The code is provided.
Result:
2.5 Optional (ungraded) exercises
Change the value of λ and observe how the decision boundary changes. The figure below shows the boundary when λ = 100.
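To reproduce this experiment, retrain and replot for several values of λ; a minimal sketch reusing the calls above (the particular λ values are my choice):
for lambda = [0 1 100]
    initial_theta = zeros(size(X, 2), 1);
    options = optimset('GradObj', 'on', 'MaxIter', 400);
    [theta, J, exit_flag] = fminunc(@(t)(costFunctionReg(t, X, y, lambda)), initial_theta, options);
    % Draw the data plus the lambda-dependent boundary
    plotDecisionBoundary(theta, X, y);
    title(sprintf('lambda = %g', lambda));
end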