1 Introduction
The biggest drawback of artificial neural networks is their long training time, which limits their use in real-time applications. In recent years, the Extreme Learning Machine (ELM) has greatly shortened the training time of feedforward neural networks. However, when the raw data are contaminated by many noise variables, or when the input dimensionality is very high, the overall performance of the ELM algorithm degrades significantly. The core of deep learning is feature mapping: it can filter out the noise in raw data and, when mapping into a lower-dimensional space, effectively reduces the data dimensionality. We therefore exploit these strengths of deep learning to compensate for the weaknesses of the ELM, yielding the Deep Extreme Learning Machine (DELM). To further improve the prediction accuracy of DELM, this article uses the Sparrow Search Algorithm (SSA) to optimize the DELM hyperparameters; simulation results show that the improved algorithm achieves higher prediction accuracy.
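To make the ELM idea above concrete, here is a minimal sketch of single-hidden-layer ELM training: input weights and biases are drawn at random and never trained, and the output weights are solved in closed form with a pseudoinverse, which is why training is so fast. This is an illustrative Python/NumPy version, not the article's code; the function names and the toy regression target are my own.

```python
import numpy as np

def elm_train(X, T, n_hidden, rng):
    """Train a single-hidden-layer ELM: random input layer, analytic output layer."""
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                           # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                     # output weights via Moore-Penrose pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Fit a smooth toy regression target
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 2))
T = (np.sin(X[:, 0]) + X[:, 1]**2).reshape(-1, 1)
W, b, beta = elm_train(X, T, n_hidden=50, rng=rng)
mse = np.mean((elm_predict(X, W, b, beta) - T)**2)
```

Because the only trained parameters (`beta`) come from a single least-squares solve, the whole "training" step is one matrix decomposition; this is the speed advantage the introduction refers to.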
2 Code Excerpt
%_________________________________________________________________________%
% Lion Swarm Optimization (LSO)                                           %
%_________________________________________________________________________%
function [Best_pos, Best_score, curve] = LSO(pop, Max_iter, lb, ub, dim, fobj)
beta = 0.5;                        % proportion of adult lions
Nc = round(pop*beta);              % number of adult lionesses
Np = pop - Nc;                     % number of cub lions
if(max(size(ub)) == 1)
    ub = ub.*ones(1,dim);
    lb = lb.*ones(1,dim);
end
% Initialize the population
X0 = initialization(pop, dim, ub, lb);
X = X0;
% Compute initial fitness values
fitness = zeros(1,pop);
for i = 1:pop
    fitness(i) = fobj(X(i,:));
end
[value, index] = min(fitness);     % find the minimum
GBestF = value;                    % global best fitness value
GBestX = X(index,:);               % global best position
curve = zeros(1,Max_iter);
XhisBest = X;                      % per-individual historical best positions
fithisBest = fitness;              % per-individual historical best fitness
indexBest = index;
gbest = GBestX;
for t = 1:Max_iter
    % Disturbance factor for the lionesses' moving range
    stepf = 0.1*(mean(ub) - mean(lb));
    alphaf = stepf*exp(-30*t/Max_iter)^10;
    % Disturbance factor for the cubs' moving range
    alpha = (Max_iter - t)/Max_iter;
    % Update lioness positions
    for i = 1:Nc
        index = i;
        while(index == i)
            index = randi(Nc);     % pick another lioness at random
        end
        X(i,:) = (X(i,:) + X(index,:)).*(1 + alphaf.*randn())./2;
    end
    % Update cub positions
    for i = Nc+1:pop
        q = rand;
        if q <= 1/3
            % follow the lion king
            X(i,:) = (gbest + XhisBest(i,:)).*(1 + alpha.*randn())/2;
        elseif q > 1/3 && q < 2/3
            % follow a randomly chosen cub
            indexT = i;
            while indexT == i
                indexT = randi(Nc) + pop - Nc;
            end
            X(i,:) = (X(indexT,:) + XhisBest(i,:)).*(1 + alpha.*randn())/2;
        else
            % move toward the position opposite to the lion king
            gbestT = ub + lb - gbest;
            X(i,:) = (gbestT + XhisBest(i,:)).*(1 + alpha.*randn())/2;
        end
    end
    % Boundary control
    for j = 1:pop
        for a = 1:dim
            if(X(j,a) > ub(a))
                X(j,a) = ub(a);
            end
            if(X(j,a) < lb(a))
                X(j,a) = lb(a);
            end
        end
    end
    % Compute fitness values
    for j = 1:pop
        fitness(j) = fobj(X(j,:));
    end
    for j = 1:pop
        if(fitness(j) < fithisBest(j))
            XhisBest(j,:) = X(j,:);
            fithisBest(j) = fitness(j);
        end
        if(fitness(j) < GBestF)
            GBestF = fitness(j);
            GBestX = X(j,:);
            indexBest = j;
        end
    end
    %% Lion king update
    Temp = gbest.*(1 + randn().*abs(XhisBest(indexBest,:) - gbest));
    Temp(Temp > ub) = ub(Temp > ub);
    Temp(Temp < lb) = lb(Temp < lb);
    fitTemp = fobj(Temp);
    if(fitTemp < GBestF)
        GBestF = fitTemp;
        GBestX = Temp;
        X(indexBest,:) = Temp;
        fitness(indexBest) = fitTemp;
    end
    [value, index] = min(fitness); % find the minimum
    gbest = X(index,:);            % best individual of the current generation
    curve(t) = GBestF;
end
Best_pos = GBestX;
Best_score = curve(end);
end
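For readers without MATLAB, the loop above can be condensed into a short Python/NumPy sketch. This is an illustrative re-implementation, not the article's code: the function name `lso`, the simplified history bookkeeping, and the sphere-function demo at the bottom are all my own, but the lioness/cub/lion-king update structure mirrors the MATLAB listing.

```python
import numpy as np

def lso(fobj, dim, lb, ub, pop=20, max_iter=100, seed=0):
    """Minimal Lion Swarm Optimization sketch mirroring the MATLAB code above."""
    rng = np.random.default_rng(seed)
    lb = np.full(dim, lb, float); ub = np.full(dim, ub, float)
    nc = pop // 2                                   # adult lionesses (beta = 0.5)
    X = lb + (ub - lb) * rng.random((pop, dim))     # random initial population
    fit = np.array([fobj(x) for x in X])
    hist_best, hist_fit = X.copy(), fit.copy()      # per-individual historical bests
    g_idx = int(fit.argmin())
    gbest_x, gbest_f = X[g_idx].copy(), fit[g_idx]  # global best
    gen_best = gbest_x.copy()                       # best of the current generation
    for t in range(1, max_iter + 1):
        stepf = 0.1 * (ub.mean() - lb.mean())
        alphaf = stepf * np.exp(-30 * t / max_iter) ** 10   # lioness disturbance
        alpha = (max_iter - t) / max_iter                   # cub disturbance
        for i in range(nc):                          # lionesses: pair with a random partner
            j = i
            while j == i:
                j = int(rng.integers(nc))
            X[i] = (X[i] + X[j]) * (1 + alphaf * rng.standard_normal()) / 2
        for i in range(nc, pop):                     # cubs: three behaviours
            q = rng.random()
            if q <= 1/3:                             # follow the lion king
                X[i] = (gen_best + hist_best[i]) * (1 + alpha * rng.standard_normal()) / 2
            elif q < 2/3:                            # follow a random cub
                j = i
                while j == i:
                    j = int(rng.integers(nc, pop))
                X[i] = (X[j] + hist_best[i]) * (1 + alpha * rng.standard_normal()) / 2
            else:                                    # move opposite to the lion king
                X[i] = (ub + lb - gen_best + hist_best[i]) * (1 + alpha * rng.standard_normal()) / 2
        X = np.clip(X, lb, ub)                       # boundary control
        fit = np.array([fobj(x) for x in X])
        improved = fit < hist_fit                    # update historical bests
        hist_best[improved], hist_fit[improved] = X[improved], fit[improved]
        if fit.min() < gbest_f:
            g_idx = int(fit.argmin())
            gbest_f, gbest_x = fit[g_idx], X[g_idx].copy()
        # lion king update: perturb around the generation best
        temp = np.clip(gen_best * (1 + rng.standard_normal()
                       * np.abs(hist_best[g_idx] - gen_best)), lb, ub)
        f_temp = fobj(temp)
        if f_temp < gbest_f:
            gbest_f, gbest_x = f_temp, temp.copy()
            X[g_idx], fit[g_idx] = temp, f_temp
        gen_best = X[fit.argmin()].copy()
    return gbest_x, gbest_f

# Demo: minimize the 5-dimensional sphere function on [-10, 10]^5
best_x, best_f = lso(lambda x: float(np.sum(x**2)), dim=5, lb=-10, ub=10)
```

The averaging-style moves contract the swarm toward the current best, while the cubs' "opposite position" branch keeps some exploration pressure; this is the same exploitation/exploration split as in the MATLAB listing.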
3 Simulation Results
4 References
[1] 马萌萌. Research on Extreme Learning Machine Algorithms Based on Deep Learning [D]. Ocean University of China, 2015.