Machine Learning Programming Assignment 4 - Neural Networks: Learning

Only the core code is listed below:

1. sigmoidGradient.m

% Sigmoid derivative: g'(z) = g(z) .* (1 - g(z))
h = 1.0 ./ (1.0 + exp(-z));
g = h .* (1 - h);
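For readers working outside Octave, the same formula can be sketched in NumPy (function names here are my own, not part of the assignment):

```python
import numpy as np

def sigmoid(z):
    """Element-wise logistic function."""
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_gradient(z):
    """Derivative of the sigmoid: g'(z) = g(z) * (1 - g(z))."""
    h = sigmoid(z)
    return h * (1 - h)

# The gradient peaks at z = 0, where sigmoid_gradient(0) = 0.25.
print(sigmoid_gradient(np.array([-1.0, 0.0, 1.0])))
```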

2. randInitializeWeights.m

% Break symmetry: sample each weight uniformly from [-epsilon_init, epsilon_init]
epsilon_init = 0.12;
W = rand(L_out, 1 + L_in) * 2 * epsilon_init - epsilon_init;
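A NumPy version of the same initialization, for illustration (the function name is my own; the extra column accounts for the bias unit):

```python
import numpy as np

def rand_initialize_weights(l_in, l_out, epsilon_init=0.12):
    """Uniform init in [-epsilon_init, epsilon_init] to break symmetry.
    Returns an (l_out, 1 + l_in) matrix; the +1 column is for the bias unit."""
    return np.random.rand(l_out, 1 + l_in) * 2 * epsilon_init - epsilon_init

# e.g. for a 400-unit input layer feeding a 25-unit hidden layer:
W = rand_initialize_weights(400, 25)
print(W.shape)  # (25, 401)
```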

3. nnCostFunction.m

% Feedforward pass and cost function

A1 = X;
A1 = [ones(m, 1), A1];
Z2 = A1 * Theta1.';
A2 = sigmoid(Z2);
A2 = [ones(m, 1), A2];
Z3 = A2 * Theta2.';
A3 = sigmoid(Z3);
H = A3;
% One-hot encode the labels
Y = zeros(m, num_labels);
for ind = 1:m
    Y(ind, y(ind)) = 1;
end
K = num_labels;
Jk = zeros(K, 1);
for k = 1:K
    Jk(k) = ( -Y(:, k).' * log(H(:, k)) ) - ( (1 - Y(:, k)).' * log(1 - H(:, k)) );
end
J = sum(Jk)/m;

% Regularization term (bias columns excluded)
J = J + ( lambda/(2*m) )*( sum(sum(Theta1(:, 2:end).^2))+sum(sum(Theta2(:, 2:end).^2)) );

% Backpropagation

delta3 = A3 - Y;
delta2 = delta3 * Theta2 .* (A2 .* (1 - A2));
delta2 = delta2(:, 2:end);   % drop the bias-unit column
Delta2 = zeros(size(delta3, 2), size(A2, 2));
Delta1 = zeros(size(delta2, 2), size(A1, 2));
for i = 1:m
    Delta2 = Delta2 + delta3(i, :).' * A2(i, :);
    Delta1 = Delta1 + delta2(i, :).' * A1(i, :);
end
Theta1_grad = Delta1/m;
Theta1_grad(:, 2:end) = Theta1_grad(:, 2:end) + Theta1(:, 2:end)*(lambda/m);
Theta2_grad = Delta2/m;
Theta2_grad(:, 2:end) = Theta2_grad(:, 2:end) + Theta2(:, 2:end)*(lambda/m);
grad = [Theta1_grad(:) ; Theta2_grad(:)];
end
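The whole routine above can be sketched as one NumPy function (my own translation, not the course's reference solution; note that labels here are 0-based, unlike the 1-based Octave indexing, and the per-example accumulation loop is replaced by equivalent matrix products):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nn_cost_function(theta1, theta2, X, y, num_labels, lam):
    """Regularized cost and gradients for a 3-layer network.
    theta1: (hidden, n+1), theta2: (num_labels, hidden+1),
    X: (m, n), y: integer labels in 0..num_labels-1."""
    m = X.shape[0]

    # Feedforward pass (prepend bias columns, mirroring the Octave code).
    a1 = np.hstack([np.ones((m, 1)), X])
    a2 = np.hstack([np.ones((m, 1)), sigmoid(a1 @ theta1.T)])
    h = sigmoid(a2 @ theta2.T)

    # One-hot encode the labels.
    Y = np.eye(num_labels)[y]

    # Cross-entropy cost plus L2 penalty on non-bias weights.
    J = -np.sum(Y * np.log(h) + (1 - Y) * np.log(1 - h)) / m
    J += (lam / (2 * m)) * (np.sum(theta1[:, 1:] ** 2) + np.sum(theta2[:, 1:] ** 2))

    # Backpropagation; dropping the bias column before the element-wise
    # product matches delta2(:, 2:end) in the Octave version.
    delta3 = h - Y
    delta2 = (delta3 @ theta2)[:, 1:] * (a2[:, 1:] * (1 - a2[:, 1:]))
    theta1_grad = delta2.T @ a1 / m
    theta2_grad = delta3.T @ a2 / m
    theta1_grad[:, 1:] += (lam / m) * theta1[:, 1:]
    theta2_grad[:, 1:] += (lam / m) * theta2[:, 1:]
    return J, theta1_grad, theta2_grad
```

A quick finite-difference check on any single weight is a good way to confirm the gradients, as the course's gradient-checking exercise does.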

Course page: https://www.coursera.org/course/ml

Reposted from: https://www.cnblogs.com/andnot/archive/2013/05/27/ml_code4.html
