MATLAB feedforward neural networks
Note: see also the related network types fitnet (function fitting) and patternnet (pattern recognition)
feedforwardnet
net=feedforwardnet;
General syntax:
feedforwardnet(hiddenSizes,trainFcn)
hiddenSizes is a vector giving the number of neurons in each hidden layer
trainFcn is the training function; the default is 'trainlm'
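A minimal sketch of the workflow (assumes the Deep Learning Toolbox is installed): fit y = x^2 with one hidden layer of 10 neurons and the default 'trainlm' trainer.

```matlab
x = -1:0.05:1;              % 1-by-N input samples (columns are samples)
t = x.^2;                   % 1-by-N targets
net = feedforwardnet(10);   % hiddenSizes = 10, trainFcn = 'trainlm'
net = train(net, x, t);     % opens the training window by default
y = net(x);                 % evaluate the trained network
perf = perform(net, t, y)   % mse between targets and outputs
```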
info (the structure printed when you display net):
dimensions:
numInputs: 1 % one input (one concurrent input source)
numLayers: 2 % two layers: one hidden layer and one output layer
numOutputs: 1 % one output
%numInputDelays: 0
%numLayerDelays: 0
%numFeedbackDelays: 0
%numWeightElements: 10
%sampleTime: 1
connections:
biasConnect: [1; 1] % vector; both layers have a bias
inputConnect: [1; 0] % vector; only the input connects to layer 1
layerConnect: [0 0; 1 0] % matrix; rows are destination layers, columns are source layers; only layer 1 connects to layer 2
outputConnect: [0 1] % vector; marks which layer is the output layer
subobjects:
input: Equivalent to inputs{1}
output: Equivalent to outputs{2}
% all of these are objects (structs) stored in cell arrays
inputs: {1x1 cell array of 1 input}
layers: {2x1 cell array of 2 layers}
outputs: {1x2 cell array of 1 output}
biases: {2x1 cell array of 2 biases}
inputWeights: {2x1 cell array of 1 weight}
layerWeights: {2x2 cell array of 1 weight}
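The connection properties above can be read directly off the network object; a small sketch showing how their shapes follow from the layer count:

```matlab
net = feedforwardnet(10);
net.biasConnect      % [1; 1] -> both layers have a bias
net.layerConnect     % [0 0; 1 0] -> layer 1 feeds layer 2
net.outputConnect    % [0 1] -> layer 2 is the output layer

% with two hidden layers the matrices grow accordingly
net2 = feedforwardnet([10 5]);
net2.layerConnect    % 3x3 matrix: layer 1 -> 2 and layer 2 -> 3
```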
functions:
adaptFcn: 'adaptwb'
adaptParam: (none)
derivFcn: 'defaultderiv'
divideFcn: 'dividerand'
divideParam: .trainRatio, .valRatio, .testRatio
divideMode: 'sample'
initFcn: 'initlay'
performFcn: 'mse'
performParam: .regularization, .normalization
plotFcns: {'plotperform', 'plottrainstate', 'ploterrhist',
'plotregression'}
plotParams: {1x4 cell array of 4 params}
trainFcn: 'trainlm' % training function
trainParam: .showWindow, .showCommandLine, .show, .epochs,
.time, .goal, .min_grad, .max_fail, .mu, .mu_dec,
.mu_inc, .mu_max
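These function parameters are all writable before training; a sketch of the commonly tuned ones (the values shown are arbitrary examples, not defaults):

```matlab
net = feedforwardnet(10);
net.trainParam.epochs   = 500;       % maximum training epochs
net.trainParam.goal     = 1e-6;      % stop when mse falls below this
net.trainParam.max_fail = 10;        % validation failures before early stop
net.divideParam.trainRatio = 0.7;    % data split used by 'dividerand'
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
net.performParam.regularization = 0.1;  % mix weight decay into 'mse'
```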
weight and bias values:
IW: {2x1 cell} containing 1 input weight matrix
LW: {2x2 cell} containing 1 layer weight matrix
b: {2x1 cell} containing 2 bias vectors
methods:
adapt: Learn while in continuous use
configure: Configure inputs & outputs
gensim: Generate Simulink model
init: Initialize weights & biases
perform: Calculate performance
sim: Evaluate network outputs given inputs
train: Train network with examples
view: View diagram
unconfigure: Unconfigure inputs & outputs
evaluate: outputs = net(inputs)
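A sketch exercising several of the methods listed above; configure sets input/output sizes from data without training, and sim is equivalent to calling the network directly:

```matlab
x = rand(3, 100); t = rand(1, 100);
net = feedforwardnet(10);
net = configure(net, x, t);  % set input/output sizes without training
net = init(net);             % re-initialize weights and biases
y1 = sim(net, x);            % classic evaluation syntax
y2 = net(x);                 % equivalent shorthand: y1 == y2
% view(net)                  % opens the network diagram window
```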
layers
>>net.layers{1}
name: 'Hidden' % display name; can be changed by assignment
dimensions: 10 % for ordinary layers this equals size; it describes the physical arrangement of neurons (mainly used by self-organizing maps)
distanceFcn: (none)
distanceParam: (none)
distances: []
initFcn: 'initnw'
netInputFcn: 'netsum' % net input function: sums the weighted inputs and the bias
netInputParam: (none)
positions: []
range: [10x2 double]
size: 10 % default neuron count; net.layers{i}.size
topologyFcn: (none)
transferFcn: 'tansig' % transfer function; net.layers{i}.transferFcn
transferParam: (none)
userdata: (your custom info)
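The layer properties above are writable; a sketch of customizing the hidden layer before training:

```matlab
net = feedforwardnet(10);
net.layers{1}.name = 'MyHidden';       % rename the hidden layer
net.layers{1}.size = 20;               % change the neuron count
net.layers{1}.transferFcn = 'logsig';  % swap tansig for logsig
net.layers{2}.transferFcn              % output layer stays 'purelin'
```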
layerWeights
This corresponds to layerConnect: [0 0; 1 0] in the info above (matrix; rows are destination layers, columns are source layers; only layer 1 connects to layer 2)
>>net.layerWeights{2,1}
delays: 0
initFcn: (none)
initSettings: .range
learn: true
learnFcn: 'learngdm'
learnParam: .lr, .mc
size: [0 10] % [destination-size source-size]; destination is 0 until the network is configured with target data
weightFcn: 'dotprod' % weight function: dot product of weight matrix and input
weightParam: (none)
userdata: (your custom info)
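The weight values themselves live in net.IW, net.LW, and net.b, whose cell indices mirror inputConnect and layerConnect; a sketch showing their shapes after configuring with 2-dimensional inputs and 1-dimensional targets:

```matlab
x = rand(2, 50); t = rand(1, 50);
net = feedforwardnet(10);
net = configure(net, x, t);  % fixes input/output sizes
size(net.IW{1,1})   % 10x2: input weights into the hidden layer
size(net.LW{2,1})   % 1x10: hidden-to-output layer weights
size(net.b{1})      % 10x1: hidden biases; net.b{2} is 1x1
```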