MOOC Machine Learning Assignment Discussion Post 4

Week 4: Neural Networks Learning

I have recently been working through the classic MOOC course Machine Learning (by Andrew Ng); course link: MACHINE LEARNING.
As I progress, I am posting the key code sections of each assignment, for discussion and exchange only; a short usage sketch follows the code.

  • sigmoidGradient
g=sigmoid(z).*(1-sigmoid(z));
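% g is the analytic derivative of the sigmoid, sigma'(z) = sigma(z).*(1 - sigma(z)),
% evaluated element-wise, so z may be a scalar, a vector, or a matrix.
% Quick sanity check: sigmoidGradient(0) returns 0.25.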
  • randInitializeWeights
epsilon_init=0.12;
W=rand(L_out,1+L_in)*2*epsilon_init-epsilon_init;
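% Random initialization breaks the symmetry between hidden units; epsilon_init = 0.12
% matches the heuristic sqrt(6)/sqrt(L_in + L_out) for this network's layer sizes.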
  • nnCostFunction
X = [ones(m, 1) X];
z2=X*Theta1'; %%%5000x25
a2=sigmoid(z2);
a2=[ones(size(a2,1),1) a2]; %%%5000x26

z3=a2*Theta2';
h=sigmoid(z3); %%%% h: 5000x10

%%% convert each label to a one-hot row, e.g. y=2 -> [0 1 0 0 0 0 0 0 0 0], y=10 -> [0 0 0 0 0 0 0 0 0 1]; Y: 5000x10
Y=zeros(m,num_labels);
for ind = 1:m 
    Y(ind, y(ind)) = 1; 
end
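% Equivalent vectorized alternative, assuming y holds integer labels 1..num_labels:
%   I = eye(num_labels);  Y = I(y, :);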

J=sum(sum(-Y.*log(h)-(1-Y).*log(1-h),2))/m;

%%%%-------- sum(x,2) sums along dimension 2 (across columns, one value per row); the outer sum then adds over the rows
r=(sum(sum(Theta1(:,2:end).^2,2))+sum(sum(Theta2(:,2:end).^2,2)))*lambda/(2*m);
J=J+r;
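% J is now the regularized cost:
%   J = (1/m) * sum_i sum_k [ -Y(i,k)*log(h(i,k)) - (1-Y(i,k))*log(1-h(i,k)) ]
%       + (lambda/(2m)) * (sum of squared non-bias weights in Theta1 and Theta2)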

D1=zeros(hidden_layer_size,1+input_layer_size); %%%% 25x401, accumulator for Theta1_grad
D2=zeros(num_labels,1+hidden_layer_size); %%%% 10x26, accumulator for Theta2_grad


delta3=h-Y; %%%%5000x10
Z2=[ones(size(z2,1),1) z2];
delta2=(delta3*Theta2).*sigmoidGradient(Z2); %%%%5000x26
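% Column 1 of delta2 corresponds to the bias unit, which propagates no error,
% so it is dropped when accumulating D1 below.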

D1=D1+delta2(:,2:end)'*X; %%%%%% 25x401
D2=D2+delta3'*a2; %%%%% 10x26

Theta1_grad=D1/m;
Theta1_grad(:,2:end)=Theta1_grad(:,2:end)+ Theta1(:,2:end)*lambda/m;
Theta2_grad=D2/m;
Theta2_grad(:,2:end)=Theta2_grad(:,2:end)+ Theta2(:,2:end)*lambda/m;
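% Note: the bias columns (column 1) of Theta1_grad and Theta2_grad are left unregularized.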

% Unroll gradients
grad = [Theta1_grad(:) ; Theta2_grad(:)];
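
For context, the sketch below shows roughly how these three functions are exercised by the assignment's driver script: initialize and unroll the weights, run numerical gradient checking, then minimize the cost with the provided fmincg optimizer. It assumes the helper files shipped with the exercise (checkNNGradients, fmincg, predict) and the ex4data1.mat data file are on the path; treat it as a usage sketch, not part of the graded code.

load('ex4data1.mat');              % X: 5000x400, y: 5000x1
input_layer_size  = 400;           % 20x20 input images
hidden_layer_size = 25;
num_labels        = 10;
lambda            = 1;

% Randomly initialize and unroll the weights
initial_Theta1 = randInitializeWeights(input_layer_size, hidden_layer_size);
initial_Theta2 = randInitializeWeights(hidden_layer_size, num_labels);
initial_nn_params = [initial_Theta1(:) ; initial_Theta2(:)];

% Compare analytic gradients with numerical ones on a small test network
checkNNGradients(lambda);

% Train with the provided fmincg optimizer
costFunction = @(p) nnCostFunction(p, input_layer_size, hidden_layer_size, ...
                                   num_labels, X, y, lambda);
options = optimset('MaxIter', 50);
[nn_params, cost] = fmincg(costFunction, initial_nn_params, options);

% Reshape the unrolled parameters back into Theta1 and Theta2
Theta1 = reshape(nn_params(1:hidden_layer_size*(input_layer_size+1)), ...
                 hidden_layer_size, (input_layer_size+1));
Theta2 = reshape(nn_params((1+(hidden_layer_size*(input_layer_size+1))):end), ...
                 num_labels, (hidden_layer_size+1));

% Training-set accuracy of the learned network
pred = predict(Theta1, Theta2, X);
fprintf('Training accuracy: %f\n', mean(double(pred == y)) * 100);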

Please credit the source when reposting.
