Stanford Machine Learning, Week 4 Assignment: Multi-class Classification and Neural Networks

Vectorizing regularized logistic regression

m = length(y); % number of training examples
J = 0;
grad = zeros(size(theta));

% Vectorized regularized cost; theta(1) (the bias term) is not regularized.
h = sigmoid(X * theta);
J = sum(-y .* log(h) - (1 - y) .* log(1 - h)) / m ...
    + lambda / (2 * m) * (theta(2:end)' * theta(2:end));

% Vectorized gradient, with the regularization term added only for j >= 2.
grad = X' * (h - y) / m;
grad(2:end) = grad(2:end) + lambda / m * theta(2:end);
grad = grad(:);

Same lesson as last week: once vectorized, the code is much more concise, so avoid explicit loops wherever possible.
A = A(:) reshapes a matrix into a single column vector.
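For readers following along in Python, here is a NumPy sketch of the same vectorized cost and gradient. The function and variable names are my own, not from the exercise; the conventions match the Octave version (a leading column of ones in X, and the bias term theta[0] excluded from regularization).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lr_cost_function(theta, X, y, lam):
    """Regularized logistic-regression cost and gradient, fully vectorized.

    X is (m, n+1) with a leading column of ones; theta is (n+1,);
    y is (m,) of 0/1 labels. theta[0] is not regularized.
    """
    m = y.size
    h = sigmoid(X @ theta)
    # Cross-entropy cost plus the L2 penalty on theta[1:].
    J = (-y @ np.log(h) - (1 - y) @ np.log(1 - h)) / m \
        + lam / (2 * m) * np.sum(theta[1:] ** 2)
    # Gradient; the regularization term is added only for j >= 1.
    grad = X.T @ (h - y) / m
    grad[1:] += lam / m * theta[1:]
    return J, grad
```

At theta = 0 every prediction is 0.5, so the cost reduces to log(2) regardless of the labels, which makes a handy sanity check.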

One-vs-all Training

m = size(X, 1);
n = size(X, 2);
all_theta = zeros(num_labels, n + 1);
X = [ones(m, 1) X]; % add the bias column
initial_theta = zeros(n + 1, 1);
options = optimset('GradObj', 'on', 'MaxIter', 50);

% Train one binary classifier per class: for class i the labels are
% (y == i), and fmincg minimizes the regularized logistic cost.
for i = 1:num_labels
    all_theta(i,:) = fmincg(@(t)(lrCostFunction(t, X, (y == i), lambda)), ...
                            initial_theta, options);
end
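The same training loop can be sketched in NumPy. Since fmincg is specific to the course materials, this sketch substitutes plain gradient descent as the optimizer (a deliberate simplification, and the function names are my own); the one-vs-all structure is otherwise identical.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def one_vs_all(X, y, num_labels, lam, iters=1000, alpha=0.5):
    """Train one regularized logistic classifier per class.

    Plain gradient descent stands in for fmincg here. X is (m, n)
    WITHOUT the bias column; labels y run 1..num_labels, matching
    the exercise. Returns all_theta of shape (num_labels, n+1).
    """
    m, n = X.shape
    Xb = np.hstack([np.ones((m, 1)), X])     # add the bias column
    all_theta = np.zeros((num_labels, n + 1))
    for c in range(1, num_labels + 1):
        yc = (y == c).astype(float)          # one-vs-all labels for class c
        theta = np.zeros(n + 1)
        for _ in range(iters):
            h = sigmoid(Xb @ theta)
            grad = Xb.T @ (h - yc) / m
            grad[1:] += lam / m * theta[1:]  # skip the bias term
            theta -= alpha * grad
        all_theta[c - 1] = theta
    return all_theta
```

On a tiny separable dataset this recovers the right labels, though for the real ex3 digits a proper optimizer (as fmincg is in the course) converges far faster.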

Neural network prediction

m = size(X, 1);
num_labels = size(Theta2, 1);
p = zeros(size(X, 1), 1);

% Feedforward: prepend a bias unit before each layer, then take the
% most probable output unit as the predicted label.
X = [ones(m, 1) X];
layer2 = sigmoid(X * Theta1');
layer2 = [ones(size(layer2, 1), 1) layer2];
layer3 = sigmoid(layer2 * Theta2');
[~, p] = max(layer3, [], 2);
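The feedforward pass translates directly to NumPy (a sketch with my own function name; the shapes follow the exercise's Theta1/Theta2 conventions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict_nn(Theta1, Theta2, X):
    """Feedforward prediction for the 3-layer network.

    Theta1: (hidden, n+1), Theta2: (num_labels, hidden+1), X: (m, n).
    Returns labels 1..num_labels (argmax over the output layer,
    1-based to match the exercise's label convention).
    """
    m = X.shape[0]
    a1 = np.hstack([np.ones((m, 1)), X])    # input layer + bias
    a2 = sigmoid(a1 @ Theta1.T)             # hidden activations
    a2 = np.hstack([np.ones((m, 1)), a2])   # add the bias unit
    a3 = sigmoid(a2 @ Theta2.T)             # output layer
    return np.argmax(a3, axis=1) + 1
```

With hand-built weights that saturate the sigmoids, the prediction is easy to verify by eye, which is useful before loading the provided ex3weights.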

One-vs-all prediction

m = size(X, 1);
num_labels = size(all_theta, 1);
p = zeros(size(X, 1), 1);

X = [ones(m, 1) X];
% sigmoid is monotonic, so the argmax of X * all_theta' equals the
% argmax of sigmoid(X * all_theta'); the sigmoid can be skipped.
scores = X * all_theta';
[~, p] = max(scores, [], 2);
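The prediction step is a one-liner in NumPy (again a sketch with my own names, exploiting the same monotonicity of the sigmoid):

```python
import numpy as np

def predict_one_vs_all(all_theta, X):
    """Pick the class whose linear score is largest.

    Because sigmoid is monotonic, argmax over Xb @ all_theta.T equals
    argmax over sigmoid(Xb @ all_theta.T), so the sigmoid is skipped.
    X is (m, n) without the bias column; labels are returned 1-based.
    """
    m = X.shape[0]
    Xb = np.hstack([np.ones((m, 1)), X])  # add the bias column
    return np.argmax(Xb @ all_theta.T, axis=1) + 1
```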