>> [Pn,ps]=mapminmax(P)
Pn =
-1.0000 -0.9007 -0.7955 -0.6568 -0.4551 -0.2711 -0.0440 0.2123 0.5153 1.0000
-1.0000 -0.8773 -0.6534 -0.5307 -0.2455 -0.0108 0.3935 0.6534 0.9025 1.0000
ps =
name: 'mapminmax'
xrows: 2
xmax: [2x1 double]
xmin: [2x1 double]
xrange: [2x1 double]
yrows: 2
ymax: 1
ymin: -1
yrange: 2
no_change: 0
gain: [2x1 double]
xoffset: [2x1 double]
>> [Tn,ts]=mapminmax(T)
Tn =
-1.0000 -0.9348 -0.8913 -0.8043 -0.6087 -0.3696 -0.1087 0.1304 0.5217 1.0000
ts =
name: 'mapminmax'
xrows: 1
xmax: 16.1000
xmin: 6.9000
xrange: 9.2000
yrows: 1
ymax: 1
ymin: -1
yrange: 2
no_change: 0
gain: 0.2174
xoffset: 6.9000
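The `ps`/`ts` fields follow directly from mapminmax's linear map y = (ymax − ymin)·(x − xmin)/(xmax − xmin) + ymin, so `gain` = yrange/xrange = 2/9.2 ≈ 0.2174 and `xoffset` = xmin = 6.9. A minimal Python sketch of that formula (an illustrative equivalent, not toolbox code; the `T` values below are recovered from the `Tn` row and `ts` fields above):

```python
import numpy as np

def mapminmax_apply(x, ymin=-1.0, ymax=1.0):
    """Linearly map a vector onto [ymin, ymax], mirroring mapminmax's formula."""
    x = np.asarray(x, dtype=float)
    xmin, xmax = x.min(), x.max()
    gain = (ymax - ymin) / (xmax - xmin)   # ts.gain = yrange / xrange
    y = gain * (x - xmin) + ymin
    return y, {"xmin": xmin, "xmax": xmax, "gain": gain, "xoffset": xmin}

# Target vector reconstructed from the transcript (xmin = 6.9, xmax = 16.1)
T = np.array([6.9, 7.2, 7.4, 7.8, 8.7, 9.8, 11.0, 12.1, 13.9, 16.1])
Tn, ts = mapminmax_apply(T)   # Tn reproduces the normalized row shown above
```

Applying the same map row-wise to the 2xN matrix `P` yields the two `Pn` rows, which is why `ps` stores `xmax`, `xmin`, and `gain` as 2x1 vectors.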
>> net=newff(Pn,Tn,9,{'tansig','purelin'},'trainlm')
net =
Neural Network
name: 'Custom Neural Network'
userdata: (your custom info)
dimensions:
numInputs: 1
numLayers: 2
numOutputs: 1
numInputDelays: 0
numLayerDelays: 0
numFeedbackDelays: 0
numWeightElements: 37
sampleTime: 1
connections:
biasConnect: [1; 1]
inputConnect: [1; 0]
layerConnect: [0 0; 1 0]
outputConnect: [0 1]
subobjects:
input: Equivalent to inputs{1}
output: Equivalent to outputs{2}
inputs: {1x1 cell array of 1 input}
layers: {2x1 cell array of 2 layers}
outputs: {1x2 cell array of 1 output}
biases: {2x1 cell array of 2 biases}
inputWeights: {2x1 cell array of 1 weight}
layerWeights: {2x2 cell array of 1 weight}
functions:
adaptFcn: 'adaptwb'
adaptParam: (none)
derivFcn: 'defaultderiv'
divideFcn: 'dividerand'
divideParam: .trainRatio, .valRatio, .testRatio
divideMode: 'sample'
initFcn: 'initlay'
performFcn: 'mse'
performParam: .regularization, .normalization
plotFcns: {'plotperform', 'plottrainstate', 'plotregression'}
plotParams: {1x3 cell array of 3 params}
trainFcn: 'trainlm'
trainParam: .showWindow, .showCommandLine, .show, .epochs,
.time, .goal, .min_grad, .max_fail, .mu, .mu_dec,
.mu_inc, .mu_max
weight and bias values:
IW: {2x1 cell} containing 1 input weight matrix
LW: {2x2 cell} containing 1 layer weight matrix
b: {2x1 cell} containing 2 bias vectors
methods:
adapt: Learn while in continuous use
configure: Configure inputs & outputs
gensim: Generate Simulink model
init: Initialize weights & biases
perform: Calculate performance
sim: Evaluate network outputs given inputs
train: Train network with examples
view: View diagram
unconfigure: Unconfigure inputs & outputs
evaluate: outputs = net(inputs)
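The `numWeightElements: 37` entry is consistent with the 2-9-1 architecture that newff builds here: 9×2 input weights + 9 hidden biases + 1×9 layer weights + 1 output bias = 37. A Python sketch of that structure and its forward pass (random weights for illustration; the toolbox's initialization is not reproduced):

```python
import numpy as np

n_in, n_hid, n_out = 2, 9, 1   # two input features, 9 tansig neurons, 1 purelin output

rng = np.random.default_rng(0)
IW = rng.standard_normal((n_hid, n_in))   # input weights,  9x2 = 18 elements
b1 = rng.standard_normal((n_hid, 1))      # hidden biases,        9 elements
LW = rng.standard_normal((n_out, n_hid))  # layer weights,  1x9 =  9 elements
b2 = rng.standard_normal((n_out, 1))      # output bias,          1 element

# Total trainable parameters, matching numWeightElements above
n_params = IW.size + b1.size + LW.size + b2.size

def forward(p):
    """tansig hidden layer (tanh), purelin (identity) output layer."""
    a1 = np.tanh(IW @ p + b1)
    return LW @ a1 + b2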
>> net.trainParam.show=50;
>> net.trainParam.lr=0.01;    % learning rate; used by gradient-descent trainers, ignored by trainlm
>> net.trainParam.mc=0.9;     % momentum constant; likewise only meaningful for traingdm/traingdx
>> net.trainParam.epochs=100000;
>> net.trainParam.goal=0.01;
>> [net,tr]=train(net,Pn,Tn)
net =
Neural Network
(display identical to the newff output above, except that trainParam now also lists the .lr and .mc fields set before training)
tr =
trainFcn: 'trainlm'
trainParam: [1x1 struct]
performFcn: 'mse'
performParam: [1x1 struct]
derivFcn: 'defaultderiv'
divideFcn: 'dividerand'
divideMode: 'sample'
divideParam: [1x1 struct]
trainInd: [1 2 4 5 6 7 9 10]
valInd: 8
testInd: 3
stop: 'Performance goal met.'
num_epochs: 3
trainMask: {[1 1 NaN 1 1 1 1 NaN 1 1]}
valMask: {[NaN NaN NaN NaN NaN NaN NaN 1 NaN NaN]}
testMask: {[NaN NaN 1 NaN NaN NaN NaN NaN NaN NaN]}
best_epoch: 2
goal: 0.0100
states: {'epoch' 'time' 'perf' 'vperf' 'tperf' 'mu' 'gradient' 'val_fail'}
epoch: [0 1 2 3]
time: [0.3110 0.4370 0.4510 0.4580]
perf: [4.8979 0.0847 0.0106 6.2624e-04]
vperf: [1.2198 0.6976 9.1008e-05 0.0354]
tperf: [1.5746 0.0850 6.7547e-06 8.7878e-06]
mu: [1.0000e-03 1.0000e-04 1.0000e-05 1.0000e-06]
gradient: [8.3152 0.8621 0.2466 0.0739]
val_fail: [0 0 0 1]
best_perf: 0.0106
best_vperf: 9.1008e-05
best_tperf: 6.7547e-06
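The `tr` record shows dividerand's default 0.7/0.15/0.15 split of the 10 samples producing the 8/1/1 `trainInd`/`valInd`/`testInd` partition, and training stopping at epoch 3 because the training MSE (6.2624e-04) fell below the 0.01 goal; `best_epoch: 2` marks the epoch with the lowest validation error (`vperf` = 9.1008e-05). A small Python check of that stopping logic against the `perf` trace above:

```python
# Training MSE per epoch, copied from tr.perf, and the goal set before training
perf = [4.8979, 0.0847, 0.0106, 6.2624e-04]
goal = 0.01

# First epoch at which perf drops below goal -> triggers 'Performance goal met.'
stop_epoch = next(e for e, p in enumerate(perf) if p < goal)
```

Note that at epoch 2 the MSE (0.0106) is still just above the goal, which is why training runs one more epoch before stopping.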