Using NARX in MATLAB: problems with NARX time-series prediction

I have run into some problems while doing time-series prediction with a NARX neural network.

1. Goal: predict output (178×1) from input (178×2); the data and script are attached below.

2. Problems:

1) Training stops after only a few epochs with "Maximum MU reached". How can this be fixed? (See the sketch below.)

2) How should the number of hidden neurons and the delay orders be chosen for this model? Nothing I have tried works well.

3) The fit is very poor. What can be done? Is it a problem with the model setup, or is the data itself unsuited to NARX?

I have only just started learning neural networks, so please point out any problems you see. Many thanks!
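For reference, a minimal sketch of the adjustments usually tried for questions 1 and 2, reusing inputSeries and targetSeries from the attached script. It assumes the standard Neural Network Toolbox property names (trainlm's mu_max defaults to 1e10; trainbr is Bayesian regularization), and the candidate grids are hypothetical, not a definitive recipe:

% "Maximum MU reached" means trainlm's damping parameter hit its ceiling,
% i.e. Levenberg-Marquardt could no longer reduce the error. Either relax
% the ceiling or switch to a regularizing training function:
net = narxnet(1:3, 1:2, 10);
net.trainParam.mu_max = 1e12;     % default is 1e10
% net.trainFcn = 'trainbr';       % often more robust on small, noisy sets

% For the neuron count and delay orders, a small grid search scored by the
% closed-loop error is a common starting point:
best = Inf;
for h = [5 10 20]                 % hypothetical hidden sizes
    for d = 1:4                   % hypothetical maximum delay
        net = narxnet(1:d, 1:d, h);
        [x,xi,ai,t] = preparets(net, inputSeries, {}, targetSeries);
        net = train(net, x, t, xi, ai);
        netc = closeloop(net);
        [xc,xic,aic,tc] = preparets(netc, inputSeries, {}, targetSeries);
        e = perform(netc, tc, netc(xc,xic,aic));
        if e < best, best = e; bestNet = net; end
    end
end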

Data:

input =

10872.8373015874 1689.60634920634

11339.7753968256 2115.03333333333

10913.6166666668 1950.79563492063

9847.67222222229 1783.36468253968

11097.2992063493 1814.22261904761

11061.2742063493 2258.42817460317

10354.1718614719 1992.64880952380

9818.05238095238 2190.72301587301

9795.04920634914 1818.13690476190

7974.02658730149 1675.79761904761

9038.48809523801 1733.78888888888

8461.31309523798 1895.97619047618

9437.25595238097 2400.22142857143

9236.41190476188 2009.86428571428

9769.72063492062 1907.91349206349

9208.16948051945 2122.44047619047

9402.66428571426 2301.70476190476

9615.39285714285 2031.05476190475

7972.20873015860 1634.88015873015

7661.97539682528 1549.70952380952

7826.03571428558 1424.63809523809

7477.40753968242 1472.19285714286

7493.69404761892 1500.39285714285

8245.06190476179 2023.99880952380

8077.04444444430 1780.96190476190

8283.84404761892 1638.12261904762

8381.96428571419 1745.37380952381

8453.45476190465 1575.06190476190

9320.94920634917 1959.02976190476

9461.73809523816 1854.09761904762

9313.65158730152 1929.56190476190

8558.53928571421 1547.90952380952

8656.99087301579 1476.63412698412

9635.40952380952 1826.56904761904

8656.55119047611 1727.24166666666

8867.37619047613 1817.63095238095

8949.49404761894 1897.17539682539

9252.14880952377 1707.39047619047

7742.36587301575 1376.65476190476

6969.23690476183 1291.59523809524

7607.10714285705 1475.79642857143

7716.09841269829 1414.91190476190

9371.79404761901 1544.30357142857

12640.6428571430 2395.94285714286

9130.21865079358 1590.05238095238

7258.29682539671 1220.97857142857

9023.30238095233 1608.37738095238

8515.98849206340 1646.65952380952

9694.23253968254 1801.65714285714

10375.9714285715 1905.58571428571

9757.79761904763 2034.60238095236

10856.8007326008 1893.21904761904

10426.7130952382 1753.73571428571

10851.1559523810 2032.22976190476

10337.2023809524 2002.57857142856

14253.4508658012 2164.42900432900

10875.7063492064 2178.00357142857

10863.2115079366 2331.01547619048

10865.5404761906 2268.41785714285

12336.6920634923 2259.35714285714

12164.6920634923 2378.08809523810

12062.3000000002 2520.27142857143

14104.7218253971 3396.21428571430

13086.0301587304 2645.96904761905

11768.4825396827 2476.24642857144

11694.0055555558 2835.65555555557

12050.4626984129 2421.40714285714

11965.9230158732 2486.65912698413

12154.4599206351 2540.73253968255

11292.1417027418 2087.74642857143

10375.5309523810 2288.51190476190

9771.01428571427 2231.47142857143

10677.9702380953 2403.34404761905

9126.85317460313 1969.49246031745

8959.89444444438 1828.20476190476

10414.2666666667 2254.98571428571

10014.4392857143 2074.08809523809

9266.55634920629 2329.51349206349

9935.53214285714 2492.29126984127

10154.8436507937 2302.03928571428

10038.8067460318 2466.17976190476

10461.3071428572 3057.15595238096

10093.0952380953 2154.57380952380

10318.8690476191 1939.07857142856

11266.0198412699 2276.35952380952

10964.1706349207 2246.96547619047

10728.4547619048 2111.23055555555

9682.37777777778 1833.88809523809

10581.5087301588 2287.66309523810

10682.3579365080 2056.09325396825

10156.6095238096 2161.97976190476

10603.6599567100 2377.90515873016

10148.2329725830 2277.56825396825

10794.8087301588 1920.23611111111

11421.1956349208 2137.62738095237

10586.2865079365 2065.42896825396

11047.6063852815 2103.04523809523

10515.9206349207 2261.92976190476

10284.2333333334 2504.81785714286

10262.8535714286 2376.45873015873

9592.39761904760 2110.35833333333

9668.35714285713 1903.06428571428

9287.30634920632 2080.27380952380

9805.25833333335 2620.92619047619

9947.53214285718 2410.62380952381

10315.9702380953 2589.43849206350

10931.2023809525 2388.93690476190

9873.51468253969 2389.22976190476

8887.29285714279 2231.01190476190

9248.42896825394 2060.25238095237

8814.07499999992 1950.25714285714

8729.06587301581 1916.78452380952

8154.68690476178 1915.43333333333

8391.34166666655 1775.12579365079

8075.93095238085 1519.90277777778

7555.03535353527 1446.87976190476

7152.69603174595 1453.06825396825

6689.68690476184 1562.17301587301

7298.10476190468 1718.18095238095

7160.21428571421 1561.03571428571

6408.19999999994 1287.60238095238

6883.83968253962 1426.89880952381

6427.16666666661 1306.14761904762

5940.86626984122 1367.51349206349

6344.53253968246 1592.46666666666

6647.62142857135 1689.83809523809

7014.76666666661 1354.90238095238

7005.11428571424 1440.00634920635

6714.63214285709 1406.06230158730

6216.37023809518 1378.16904761905

6437.40595238087 1258.68571428571

6940.50833333327 1514.13492063492

6721.37420634911 1458.70476190476

6809.64325396818 1437.30714285714

7141.22738095231 1430.33571428571

6959.42658730151 1345.67857142857

6800.24404761897 1227.98333333333

6645.10714285707 1365.80000000000

5711.25833333331 1266.90238095238

6803.66666666660 1360.84166666667

6876.13531746024 1378.39444444445

6700.45277777770 1516.30952380952

6898.37142857136 1471.27619047619

6837.44285714278 1464.30833333333

6536.77182539676 1336.68730158730

6437.77738095231 1663.92142857142

6959.89246031738 1742.42857142857

7197.37182539675 1700.77023809523

11150.9714285715 2925.61071428572

13675.8079365082 3738.08452380956

12881.0226551229 3018.21666666668

12700.6178571430 3292.09166666668

10078.4523809524 2990.12539682540

10819.2595238096 2695.37619047620

12449.8393217895 3240.46309523812

12695.5174603177 3610.42023809526

12658.5011904764 4218.86904761907

11566.1234126986 2886.20357142857

11835.4345238096 3457.87261904763

12282.7861111112 4003.42738095240

12499.2980158732 3143.20238095240

13067.7825396828 2947.47380952382

17361.6440476193 4683.19087301587

16396.3762265516 4983.05934343431

9799.70357142859 2442.64642857142

13493.7808913312 3165.64285714287

14058.0031746034 3794.34801587303

15635.6583333337 3957.15833333336

20791.5535714285 4614.47619047621

17946.2821428573 4274.67500000003

17346.6765873019 3931.27857142860

13337.6321789324 3616.66984126986

13361.2107142860 3795.06071428573

16686.1595238100 5899.64642857141

18878.8584776336 5522.68809523805

18934.4007936508 4716.20317460318

20009.7301587302 4424.13055555556

20381.8765873015 4536.62380952380

output =

82972908

96922616

84103424

97286824

110470224

123653624

182202106

146466214

122449749

121151181

107111603.333333

93072025.6666667

79032448

140766458

144450676

116293297

112459059

110613169.900000

108767280.800000

106921391.700000

105075502.600000

103229613.500000

101383724.400000

99537835.3000000

97691946.2000000

95846057.1000000

94000168

79899033

88077108

102502245

132112618

130962789

129812960

128663131

113111266

143178638

168735649

159516279

156681101.666667

153845924.333333

151010747

129905422

171919445

168336377

127469191

137643842.333333

147818493.666667

157993145

144026223

209611260

196492927

224078323

237494191

250910059

264325927

204915487

161587193

130092411

175327600

181051192.333333

186774784.666667

192498377

177986087

164340861

163455241

172848316

175250198.333333

177652080.666667

180053963

179019193

300680350

201753037

174041654

172893581.666667

171745509.333333

170597437

164386318

159160193

132220591

128093888

117646265.333333

107198642.666667

96751020

105672309

143511606

126556494

104654596

108950103.833333

113245611.666667

117541119.500000

121836627.333333

126132135.166667

130427643

121414034

115493112.333333

109572190.666667

103651269

120107023

129849157

168048388

174756932

166799538.666667

158842145.333333

150884752

149213455

180848799

157013367

184872058

189561009.333333

194249960.666667

198938912

213829644

210888138

189508493

148338971

158429083.600000

168519196.200000

178609308.800000

188699421.400000

198789534

166902672

171253481

173463660.666667

175673840.333333

177884020

171429659

162782903

134510072

130636795

137638355.666667

144639916.333333

151641477

134945118

128280127

141881661

145381275

138156700

130932125.000000

123707550

127391706

142044077

135260753

127509607

138062940.666667

148616274.333333

159169608

153871838

140892553

133920422

138107573.500000

142294725

146481876.500000

150669028

112019358

100316380

106859858

116958713

116351498.333333

115744283.666667

115137069

109304380

135302513

126785904

122220306

119220793

116221280.000000

113221767

114161422

102420069

108327382

108591311.750000

108855241.500000

109119171.250000

109383101

99060304

97088679

101409052

109574398

Script:

% Solve an Autoregression Problem with External Input with a NARX Neural Network

% Script generated by NTSTOOL

% Created Mon Aug 27 17:16:23 CST 2012

%

% This script assumes these variables are defined:
%
%   input - input time series.
%   output - feedback time series.

% Load time series.
load test

inputSeries = tonndata(input,false,false);

targetSeries = tonndata(output,false,false);

% Create a Nonlinear Autoregressive Network with External Input

inputDelays = 1:3;

feedbackDelays = 1:2;

hiddenLayerSize = 200;
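% Note: 200 hidden neurons is very large for a 178-sample series; with this
% many free parameters the network can memorize the training set, which
% likely contributes to questions 1 and 3. Single digits to a few tens is a
% more typical starting range for data this short.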

net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize);

% Prepare the Data for Training and Simulation

% The function PREPARETS prepares timeseries data for a particular network,

% shifting time by the minimum amount to fill input states and layer states.

% Using PREPARETS allows you to keep your original time series data unchanged, while

% easily customizing it for networks with differing numbers of delays, with

% open loop or closed loop feedback modes.

[inputs,inputStates,layerStates,targets] = preparets(net,inputSeries,{},targetSeries);
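% With inputDelays = 1:3 and feedbackDelays = 1:2 the largest delay is 3, so
% PREPARETS consumes the first 3 timesteps as initial states and the prepared
% series have 178 - 3 = 175 steps (a quick check, assuming the 178-sample
% data above):
% numel(targets)   % expected: 175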

% Setup Division of Data for Training, Validation, Testing

net.divideParam.trainRatio = 60/100;

net.divideParam.valRatio = 20/100;

net.divideParam.testRatio = 20/100;
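% Note (assumption): narxnet defaults to 'dividerand', which picks the
% training/validation/test timesteps at random. For a sequential forecast it
% can be more realistic to hold out a contiguous block at the end:
% net.divideFcn = 'divideblock';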

% Train the Network

[net,tr] = train(net,inputs,targets,inputStates,layerStates);

% Test the Network

outputs = net(inputs,inputStates,layerStates);

errors = gsubtract(targets,outputs);

performance = perform(net,targets,outputs)
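% Optional check (added lines, not part of the generated script): report the
% open-loop error on the original scale as an RMSE.
rmse = sqrt(mean(cell2mat(errors).^2))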

% View the Network

view(net)

% Plots

% Uncomment these lines to enable various plots.

%figure, plotperform(tr)

%figure, plottrainstate(tr)

%figure, plotregression(targets,outputs)

%figure, plotresponse(targets,outputs)

%figure, ploterrcorr(errors)

%figure, plotinerrcorr(inputs,errors)

% Closed Loop Network

% Use this network to do multi-step prediction.

% The function CLOSELOOP replaces the feedback input with a direct

% connection from the output layer.

netc = closeloop(net);

netc.name = [net.name ' - Closed Loop'];

view(netc)

[xc,xic,aic,tc] = preparets(netc,inputSeries,{},targetSeries);

yc = netc(xc,xic,aic);

closedLoopPerformance = perform(netc,tc,yc)
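% The closed-loop error is usually worse than the open-loop one, since each
% prediction is fed back in place of the measured output. A sketch of a true
% multi-step forecast beyond the known data (assumption: no future exogenous
% inputs are available, so the last observed input is simply repeated):
[yc2, xfc, afc] = netc(xc, xic, aic);        % also capture the final states
futureX = repmat(inputSeries(end), 1, 5);    % 5 hypothetical future steps
yFuture = netc(futureX, xfc, afc)            % feedback supplied by the loop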

% Early Prediction Network

% For some applications it helps to get the prediction a timestep early.

% The original network returns predicted y(t+1) at the same time it is given y(t+1).

% For some applications such as decision making, it would help to have predicted

% y(t+1) once y(t) is available, but before the actual y(t+1) occurs.

% The network can be made to return its output a timestep early by removing one delay

% so that its minimal tap delay is now 0 instead of 1.  The new network returns the

% same outputs as the original network, but outputs are shifted left one timestep.

nets = removedelay(net);

nets.name = [net.name ' - Predict One Step Ahead'];

view(nets)

[xs,xis,ais,ts] = preparets(nets,inputSeries,{},targetSeries);

ys = nets(xs,xis,ais);

earlyPredictPerformance = perform(nets,ts,ys)
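% With one delay removed, the minimal tap delay of nets is 2 instead of 3,
% so PREPARETS returns one extra timestep (a quick check, assuming the
% 178-sample data above):
% numel(ts) - numel(targets)   % expected: 1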
