Elman Neural Network in MATLAB


by: Z.H.Gao

1. Input samples

Use sin(xt), sin(2xt), sin(0.5xt), together with the time t, to predict cos(xt).

FIG. 1. The raw data.
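The code below loads a ready-made `sine` matrix from a MAT-file. If you do not have that file, a matrix with the layout described above (columns: t, sin(t), sin(2t), sin(0.5t), cos(t)) can be generated with a sketch like the following; the 100-sample length is an assumption chosen to leave a test segment after the 90 training samples used below:

```matlab
% Hypothetical generator for the 'sine' dataset (layout assumed from the text).
t = linspace(0, 4*pi, 100)';                       % 100 samples (assumed length)
sine = [t, sin(t), sin(2*t), sin(0.5*t), cos(t)];  % [t, inputs..., target]
save sine sine                                     % so that 'load sine' works
```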

2. MATLAB implementation code
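Before the code, it helps to write down the forward recurrence that the training loop implements. With $x_t$ the input row at step $t$, $h_t$ the hidden (context) state, and $f$ the chosen activation, the Elman network computes:

```latex
h_t = f\left(W_{\mathrm{in}}\, x_t + W_h\, h_{t-1} + b_h\right), \qquad
\hat{y}_t = f\left(W_{\mathrm{out}}\, h_t + b_{\mathrm{out}}\right)
```

In the listing, $W_{\mathrm{in}}$ and $b_h$ are packed together as `inputBW`, $W_{\mathrm{out}}$ and $b_{\mathrm{out}}$ as `outputBW`, and $W_h$ is `hiddenW`.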

clear all;close all;clc
%sine,'tanh',Lrate=0.02,Nohidden=12;
%sine,'sigmoid',Lrate=0.2,Nohidden=12;
load sine
sine=sine';
sine=mapminmax(sine,0,1);
sine=sine';
t=sine(:,1);
inst=sine(:,2:end-1);
label=sine(:,end);
%%
datalength=90;
trainx=inst(1:datalength,:);
trainy=label(1:datalength,:);
testx=inst(datalength+1:end,:);
testy=label(datalength+1:end,:);
%%
epoch=1000;
Lrate=0.8;
momentum=1;          % declared but not used in this implementation
backstep=20;
ActivationF='sigmoid';
Nohidden=12;         % number of hidden nodes; should not be set too small
inputW=2*rand(size(trainx,2),Nohidden)-1;
inputB=rand(1,Nohidden);
inputBW=[inputB;inputW];
outputW=2*rand(Nohidden,size(trainy,2))-1;
outputB=rand(1,size(trainy,2));
outputBW=[outputB;outputW];
hiddenW=2*rand(Nohidden,Nohidden)-1;
stateH=zeros(datalength,Nohidden);
%%
for v=1:1:epoch
    %%
    % forward pass: feed the samples in order, from 1 to n
    for i=1:1:datalength
        x=[1,trainx(i,:)];
        if i==1
            tempH(i,:)=x*inputBW;
        else
            tempH(i,:)=x*inputBW+stateH(i-1,:)*hiddenW;
        end
        H = ActivationFunction(tempH(i,:),ActivationF);
        stateH(i,:) = H;
        tempY(i,1) = [1,H]*outputBW;
    end
    trainResult = ActivationFunction(tempY,ActivationF);
    Error=trainResult-trainy;
    trainMSE(v,1)=sum(sum(Error.^2))/datalength;
    %%
    % backward pass (truncated BPTT): do not cut the backtracking too short
    DinputBW=zeros(size(inputBW));
    DhiddenW=zeros(size(hiddenW));
    Dout=Error.*GradientValue(tempY,ActivationF);
    DoutputBW=[ones(datalength,1),stateH]'*Dout;
    DH=Dout*outputBW';
    DH=DH(:,2:end);
    for i = datalength:-1:1
        DtempH = DH(i,:).*GradientValue(tempH(i,:),ActivationF);
        for bptt_i = i:-1:max(1,i-backstep)
            DinputBW=DinputBW+[1,trainx(bptt_i,:)]'*DtempH;
            if bptt_i-1>0
                DhiddenW=DhiddenW+stateH(bptt_i-1,:)'*DtempH;
                DtempH=DtempH*hiddenW'.*GradientValue(tempH(bptt_i-1,:),ActivationF);
            end
        end
    end
    %%
    inputBW=inputBW-Lrate*DinputBW;
    hiddenW=hiddenW-Lrate*DhiddenW;
    outputBW=outputBW-Lrate*DoutputBW;  % fixed: original updated undefined DoutputW
    % Lrate=0.9999*Lrate;
    %%
end
%%
% test phase
for i=1:1:size(testx,1)
    x=[1,testx(i,:)];
    tempH(i+datalength,:)=x*inputBW+stateH(i+datalength-1,:)*hiddenW;
    H = ActivationFunction(tempH(i+datalength,:),ActivationF);
    stateH(i+datalength,:) = H;
    tempResult(i,1) = [1,H]*outputBW;
end
testResult = ActivationFunction(tempResult,ActivationF);
Error=testResult-testy;
testMSE=sum(sum(Error.^2))/size(testx,1)
%%
t1=t(1:datalength,:);t2=t(datalength+1:end,:);
figure(1);plot(trainMSE);
figure(2);plot(t1,trainy,'-*b');hold on;plot(t1,trainResult,'-or');
hold on;plot(t2,testy,'-*k');hold on;plot(t2,testResult,'-og');
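The listing calls two helper functions, `ActivationFunction` and `GradientValue`, that are not included in the post. A minimal sketch consistent with how they are used (element-wise activation, and its derivative evaluated at the pre-activation values) could look like this; save each in its own .m file. The 'tanh' branch is an assumption based on the commented settings at the top of the script:

```matlab
% ActivationFunction.m
function y = ActivationFunction(x, type)
% Element-wise activation; 'type' is 'sigmoid' or 'tanh'.
switch type
    case 'sigmoid'
        y = 1./(1+exp(-x));
    case 'tanh'
        y = tanh(x);
end
end

% GradientValue.m
function g = GradientValue(x, type)
% Derivative of the activation, evaluated at the pre-activation x.
switch type
    case 'sigmoid'
        s = 1./(1+exp(-x));
        g = s.*(1-s);
    case 'tanh'
        g = 1 - tanh(x).^2;
end
end
```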

FIG. 2. Training MSE.

FIG. 3. Elman prediction results.

3. Elman with the MATLAB toolbox

clear all;close all;clc
% use type/open/edit to inspect the toolbox source code
load sine
t=sine(:,1);
inst=sine(:,2:end-1);
label=sine(:,end);
%%
datalength=90;
trainx=inst(1:datalength,:)';
trainy=label(1:datalength,:)';
testx=inst(datalength+1:end,:)';
testy=label(datalength+1:end,:)';
%%
TF1='tansig';TF2='tansig';  % options: 'tansig','purelin','logsig'
net=newelm(trainx,trainy,[6,4],{TF1 TF2},'traingda');
net.trainParam.epochs=1000;
net.trainParam.goal=1e-7;
net.trainParam.lr=0.5;
net.trainParam.mc=0.9;      % momentum factor (default 0.9)
net.trainParam.show=25;     % display interval, in epochs
net.trainFcn='traingda';
net.divideFcn='';
[net,tr]=train(net,trainx,trainy);
[trainoutput,trainPerf]=sim(net,trainx,[],[],trainy);  % sim(net,inputs,input delays,layer delays,targets)
[testoutput,testPerf]=sim(net,testx,[],[],testy);      % network outputs on the test data
%%
MSE=mse(testoutput-testy)
figure(1)
t1=t(1:datalength,:);t2=t(datalength+1:end,:);
plot(t1,trainy,'-k*');hold on;plot(t1,trainoutput,'-g*');
plot(t2,testy,'-b*');hold on;plot(t2,testoutput,'-r*');
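Note that `newelm` was deprecated in later releases of the Neural Network (now Deep Learning) Toolbox; on recent MATLAB versions a roughly equivalent setup uses `layrecnet`. A sketch, assuming the data has already been split as above (recurrent training expects sequence/cell format, hence `con2seq` and `preparets`):

```matlab
% Rough layrecnet equivalent for MATLAB versions without newelm.
net = layrecnet(1, 6, 'traingda');   % 1-step layer delay, 6 hidden nodes (size assumed)
net.trainParam.epochs = 1000;
net.divideFcn = '';
X = con2seq(trainx); T = con2seq(trainy);
[Xs,Xi,Ai,Ts] = preparets(net, X, T);
net = train(net, Xs, Ts, Xi, Ai);
```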


