
Matlab R2014a: the newff function


newff creates a feed-forward backpropagation network.
Obsoleted in R2010b (NNET 7.0); last used in R2010a (NNET 6.0.4).

Syntax

net = newff(P,T,S)
net = newff(P,T,S,TF,BTF,BLF,PF,IPF,OPF,DDF)

Parameters

In net = newff(P,T,S):

  • P - Input vectors with R elements.
  • T - Target vectors with SN elements.
  • Si - Sizes of the N-1 hidden layers, S1 to S(N-1); default is []. (The output layer size SN is determined by T.)
    Returns an N-layer feed-forward backprop network; a minimal sketch follows below.
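
    A minimal sketch of this three-argument form (P, T, and the layer
    sizes here are made-up illustration data, not from the original post):

        P = rand(3,100);          % 100 samples of 3-element input vectors
        T = rand(2,100);          % 100 samples of 2-element target vectors
        net = newff(P,T,[10 5]);  % two hidden layers (10 and 5 neurons);
                                  % the output layer size 2 comes from T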

In net = newff(P,T,S,TF,BTF,BLF,PF,IPF,OPF,DDF):

  • TFi - Transfer function of the ith layer. Default is 'tansig' for hidden layers and 'purelin' for the output layer.
  • BTF - Backprop network training function, default = 'trainlm'.
  • BLF - Backprop weight/bias learning function, default = 'learngdm'.
  • PF - Performance function, default = 'mse'.
  • IPF - Row cell array of input processing functions. Default is {'fixunknowns','removeconstantrows','mapminmax'}.
  • OPF - Row cell array of output processing functions. Default is {'removeconstantrows','mapminmax'}.
  • DDF - Data division function, default = 'dividerand'.

    Returns an N-layer feed-forward backprop network; a fully spelled-out call follows below.
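
    As a sketch, the same call with every optional argument written out at
    its default value (reusing the illustrative P and T from above):

        net = newff(P,T,[10 5], ...
            {'tansig','tansig','purelin'}, ...                     % TF, one per layer
            'trainlm', ...                                         % BTF
            'learngdm', ...                                        % BLF
            'mse', ...                                             % PF
            {'fixunknowns','removeconstantrows','mapminmax'}, ...  % IPF
            {'removeconstantrows','mapminmax'}, ...                % OPF
            'dividerand');                                         % DDF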

    The transfer functions TF{i} can be any differentiable transfer
    function such as TANSIG, LOGSIG, or PURELIN.
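
    These can be called directly to see their behavior, for example:

        n = [-2 -1 0 1 2];
        tansig(n)    % hyperbolic tangent sigmoid, outputs in (-1,1)
        logsig(n)    % log-sigmoid, outputs in (0,1)
        purelin(n)   % linear, output equals input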

    The training function BTF can be any of the backprop training
    functions such as TRAINLM, TRAINBFG, TRAINRP, TRAINGD, etc.
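
    The training function can also be swapped after creation by assigning
    the network's trainFcn property, e.g. to one of the listed alternatives:

        net = newff(P,T,20);
        net.trainFcn = 'trainrp';  % resilient backpropagation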

    WARNING: TRAINLM is the default training function because it
    is very fast, but it requires a lot of memory to run. If you get
    an "out-of-memory" error when training, try one of the following:

    (1) Slow down TRAINLM training but reduce its memory requirements by
    setting NET.efficiency.memoryReduction to 2 or more. (See HELP TRAINLM.)
    (2) Use TRAINBFG, which is slower but more memory efficient than TRAINLM.
    (3) Use TRAINRP, which is slower but more memory efficient than TRAINBFG.
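
    Option (1) as a sketch, using the property path named in the warning
    (continuing from a previously created net):

        net.trainFcn = 'trainlm';
        net.efficiency.memoryReduction = 2;  % trade speed for lower memory use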

    The learning function BLF can be either of the backpropagation
    learning functions LEARNGD or LEARNGDM.
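
    The learning function is stored per weight and bias; as a sketch, it can
    be reassigned through the corresponding network-object properties:

        net.inputWeights{1,1}.learnFcn = 'learngd';
        net.layerWeights{2,1}.learnFcn = 'learngd';
        net.biases{1}.learnFcn = 'learngd';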

    The performance function can be any of the differentiable performance
    functions such as MSE or MSEREG.
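
    Switching to MSEREG, as a sketch (MSEREG blends mean squared error with
    mean squared weights, controlled by a ratio parameter):

        net.performFcn = 'msereg';
        net.performParam.ratio = 0.5;  % weight given to the error term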

    Examples

    [inputs,targets] = simplefit_dataset;  % built-in demo data ('simplefitdata' does not exist)
    net = newff(inputs,targets,20);        % one hidden layer of 20 neurons
    net = train(net,inputs,targets);
    outputs = net(inputs);
    errors = outputs - targets;
    perf = perform(net,targets,outputs)    % perform takes targets first, then outputs
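
    The call outputs = net(inputs) is shorthand for outputs = sim(net,inputs);
    either form simulates the trained network on new inputs.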

    Algorithm

    Feed-forward networks consist of N layers using the DOTPROD
    weight function, NETSUM net input function, and the specified
    transfer functions.

    The first layer has weights coming from the input. Each subsequent
    layer has a weight coming from the previous layer. All layers
    have biases. The last layer is the network output.

    Each layer’s weights and biases are initialized with INITNW.

    Adaptation is done with TRAINS, which updates weights with the
    specified learning function. Training is done with the specified
    training function. Performance is measured according to the specified
    performance function.
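
    The structure described above can be inspected directly on the network
    object; as a sketch (these are standard network-object fields):

        net = newff(P,T,[10 5]);
        net.IW{1,1}            % weights coming from the input into layer 1
        net.LW{2,1}            % weights from layer 1 into layer 2
        net.b{1}               % biases of layer 1
        net.layers{1}.initFcn  % 'initnw', as noted above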

