Implementing the BP Neural Network Algorithm in MATLAB

Implementing a BP neural network at the algorithm level

The algorithm follows Andrew Ng's Machine Learning course on Coursera. The tunable parameters are: DATA (the training set), NN_Size (the hidden and output layer sizes), lambda (the regularization strength), maxIter (the number of gradient-descent iterations), and alpha (the learning rate). Changing these parameters affects the network's performance.
The code is as follows:

% neural_Network
DATA=[4 2 0;6 2 0;5 3 0;4 4 0;4 6 0;5 5 0;6 6 0;7 4 0;8 3 0;7 8 0;9 5 0;
      2 9 1;3 9 1;4 11 1;5 11 1;7 11 1;9 11 1;5 8 1;7 9 1;9 9 1;6 10 1;8 12 1;
      9 7 2;10 2 2;10 4 2;10 6 2;11 5 2;11 7 2;11 8 2;12 3 2;12 5 2;12 7 2;12 9 2;13 4 2;13 6 2;];
NN_Size = [4;4;3];   % hidden layer sizes and output layer size
lambda = 0.01;       % regularization strength
maxIter = 3000;      % number of gradient-descent iterations
alpha = 0.2;         % learning rate
%*****************************************************************************
% parse DATA: each row is [x1, x2, label]
[m,n]=size(DATA);
NN_Size = [n-1;NN_Size];    % prepend the input layer size (number of features)
numOfLayers =  numel(NN_Size);
%--------------------------------------------------------------------------
% randomly initialize weights in [-epsilon_init, epsilon_init], with
% epsilon_init = sqrt(6/(L_in+L_out)); here L_in and L_out are taken to be
% the input and output layer sizes for all layers
epsilon_init = sqrt(6/(NN_Size(1)+NN_Size(end)));
Theta = cell(1,numOfLayers-1);   % one weight matrix per layer transition
for i = 1:numOfLayers-1
    Theta{i}=rand(NN_Size(i+1),NN_Size(i)+1)*2*epsilon_init-epsilon_init;
end
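% drawing weights uniformly from [-epsilon_init, epsilon_init] breaks the
% symmetry between units; an all-zero initialization would make every unit
% in a layer compute the same function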
%--------------------------------------------------------------------------
% X: all input samples; y: labels; Y: one-hot encoded labels
X = DATA(:,1:(end-1));
y = DATA(:,end);
y_set = unique(y);
Y = zeros(numel(y_set),m);   % one row per class, one column per sample
for i=1:numel(y_set)
    Y(i,:) = y==y_set(i);
end
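% e.g. with y_set = [0;1;2], a sample labeled 1 becomes the column [0;1;0]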
A = cell(1,numOfLayers);
J = zeros(maxIter,1);   % cost history; zeros(maxIter) would allocate a square matrix
A{1} = [ones(m,1),X]'; % size: 3Xm;
for iteration = 1:maxIter
%*****************************************************************************
% get neural network output
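% each step computes a^(l+1) = g(Theta^(l) * a^(l)) with g(z) = 1/(1+exp(-z)),
% prepending a bias row of ones (the output layer's bias row is stripped later)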
    for layer = 1:numOfLayers-1
         A{layer+1} = [ones(1,m);1./(1+exp(-Theta{layer}*A{layer}))];
    end
%--------------------------------------------------------------------------
% calc J
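% cross-entropy cost over all samples i and classes k:
%   J = -(1/m) * sum_i sum_k [ y_k(i)*log(h_k(i)) + (1-y_k(i))*log(1-h_k(i)) ]
% the trace expression below is an equivalent one-line form of this double sum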
    J(iteration) = (-1/m)*trace(Y'*log(A{numOfLayers}(2:end,:))+(1-Y')*log(1-A{numOfLayers}(2:end,:)));
%*****************************************************************************
% back propagation by gradient descent
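% output-layer error:   delta^(L) = a^(L) - y
% hidden-layer error:   delta^(l) = (Theta^(l))' * delta^(l+1) .* a^(l) .* (1-a^(l))
% gradient accumulator: Delta^(l) = delta^(l+1) * (a^(l))'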
    Delta = cell(1,numOfLayers-1);
    delta = A{end}(2:end,:)-Y;        % drop the bias unit of the output layer
    Delta{end} = delta*A{end-1}';
    for layer_BP = numOfLayers-1:-1:2 % no delta is needed for the input layer
%--------------------------------------------------------------------------
        delta = Theta{layer_BP}'*delta.*A{layer_BP}.*(1-A{layer_BP});
        delta = delta(2:end,:);       % drop the bias-row error term
        Delta{layer_BP-1} = delta*A{layer_BP-1}';
    end
%*****************************************************************************
% gradient descent
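% gradient step Theta := Theta - alpha*Delta/m, followed by L2 weight decay
% on the non-bias columns: Theta_j := Theta_j - (alpha*lambda/m)*Theta_j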
    for i = 1:numOfLayers-1
        Theta{i} = Theta{i} - alpha/m*Delta{i};
        Theta{i}(:,2:end) = Theta{i}(:,2:end) - alpha*lambda/m*Theta{i}(:,2:end);
    end
end
%*****************************************************************************
% test
%--------------------------------------------------------------------------
% plot J
figure('name','J')
plot(J);
%*****************************************************************************
% plot classification
%--------------------------------------------------------------------------
% forward pass over the training set with the trained weights
a = [ones(m,1),X]'; % size: 3 x m
for layer = 1:numOfLayers-1
    a = [ones(1,m);1./(1+exp(-Theta{layer}*a))];
end
%--------------------------------------------------------------------------
% plot the predicted classification
[~,indx] = max(a(2:end,:));   % index of the most activated output unit per sample
classes = unique(indx);
figure('name','classification')
hold on
for i = 1:numel(classes)
    plot(X(indx==classes(i),1),X(indx==classes(i),2),'o')
end
%--------------------------------------------------------------------------
% show raw DATA with the true labels
figure('name','show DATA')
hold on
for i=1:numel(y_set)    % loop over the true classes, not the predicted ones
    indx1 = find(DATA(:,end)==y_set(i));
    plot(DATA(indx1,1),DATA(indx1,2),'o');
end
y_pred = y_set(indx)';      % map class indices back to the original labels
Accuracy = sum(y_pred==y)/m

Output: Accuracy = 0.9714
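After training, the same forward pass can classify a new point. Below is a minimal sketch that is not part of the original script: it assumes the trained Theta cell array and y_set are still in the workspace, and the test point x = [5; 10] is made up for illustration.

% classify one new sample with the trained network
x = [5; 10];                              % hypothetical test point [x1; x2]
a = [1; x];                               % prepend the bias unit
for layer = 1:numel(Theta)
    a = [1; 1./(1+exp(-Theta{layer}*a))]; % one sigmoid forward step with bias
end
[~,k] = max(a(2:end));                    % most activated output unit
predicted_label = y_set(k)                % map the index back to a class label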
