newff in MATLAB: how does the newff function work?


function net = newff(varargin)
%NEWFF Create a feed-forward backpropagation network.
%
%  Syntax
%
%    net = newff(P,T,S,TF,BTF,BLF,PF,IPF,OPF,DDF)
%
%  Description
%
%    NEWFF(P,T,S,TF,BTF,BLF,PF,IPF,OPF,DDF) takes,
%      P   - RxQ1 matrix of Q1 representative R-element input vectors.
%      T   - SNxQ2 matrix of Q2 representative SN-element target vectors.
%      Si  - Sizes of N-1 hidden layers, S1 to S(N-1), default = [].
%            (Output layer size SN is determined from T.)
%      TFi - Transfer function of ith layer. Default is 'tansig' for
%            hidden layers, and 'purelin' for output layer.
%      BTF - Backprop network training function, default = 'trainlm'.
%      BLF - Backprop weight/bias learning function, default = 'learngdm'.
%      PF  - Performance function, default = 'mse'.
%      IPF - Row cell array of input processing functions.
%            Default is {'fixunknowns','remconstantrows','mapminmax'}.
%      OPF - Row cell array of output processing functions.
%            Default is {'remconstantrows','mapminmax'}.
%      DDF - Data division function, default = 'dividerand';
%    and returns an N layer feed-forward backprop network.

%

% The transfer functions TF{i} can be any differentiable transfer

% function such as TANSIG, LOGSIG, or PURELIN.

%

% The training function BTF can be any of the backprop training

% functions such as TRAINLM, TRAINBFG, TRAINRP, TRAINGD, etc.

%

% *WARNING*: TRAINLM is the default training function because it

% is very fast, but it requires a lot of memory to run. If you get

% an "out-of-memory" error when training try doing one of these:

%

% (1) Slow TRAINLM training, but reduce memory requirements, by

% setting NET.trainParam.mem_reduc to 2 or more. (See HELP TRAINLM.)

% (2) Use TRAINBFG, which is slower but more memory efficient than TRAINLM.

% (3) Use TRAINRP which is slower but more memory efficient than TRAINBFG.
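In practice, the three workarounds above look roughly like this (a minimal sketch, assuming the Neural Network Toolbox is installed; `P` and `T` are placeholder training data, not from the original post):

```matlab
% Option (1): keep trainlm, trading speed for memory.
net = newff(P, T, 20);
net.trainParam.mem_reduc = 2;   % 2 or more; see HELP TRAINLM

% Options (2) and (3): pass BTF explicitly to pick a more
% memory-efficient training function.
net_bfg = newff(P, T, 20, {'tansig','purelin'}, 'trainbfg');
net_rp  = newff(P, T, 20, {'tansig','purelin'}, 'trainrp');
```

Note that the TF cell array, when given, lists one transfer function per layer (here: one hidden layer plus the output layer).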

%

% The learning function BLF can be either of the backpropagation

% learning functions such as LEARNGD, or LEARNGDM.

%

% The performance function can be any of the differentiable performance

% functions such as MSE or MSEREG.

%

%
%  Examples
%
%    load simplefit_dataset
%    net = newff(simplefitInputs,simplefitTargets,20);
%    net = train(net,simplefitInputs,simplefitTargets);
%    simplefitOutputs = sim(net,simplefitInputs);

%
%  Algorithm
%
%    Feed-forward networks consist of Nl layers using the DOTPROD
%    weight function, NETSUM net input function, and the specified
%    transfer functions.
%
%    The first layer has weights coming from the input. Each subsequent
%    layer has a weight coming from the previous layer. All layers
%    have biases. The last layer is the network output.
%
%    Each layer's weights and biases are initialized with INITNW.
%
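A fuller call that exercises the S and TF arguments described in the help text might look like this (a sketch with made-up demo data, assuming the Neural Network Toolbox):

```matlab
% Random demo data: 3-element inputs, 2-element targets, 100 samples.
P = rand(3, 100);   % R=3 inputs,  Q1=100 vectors
T = rand(2, 100);   % SN=2 outputs, Q2=100 vectors

% Two hidden layers of sizes 10 and 5; the output layer size (2)
% is determined from T. Transfer functions: tansig for both hidden
% layers, purelin for the output; train with trainlm (the default).
net = newff(P, T, [10 5], {'tansig','tansig','purelin'}, 'trainlm');

net.trainParam.epochs = 100;   % optional: cap training epochs
net = train(net, P, T);
Y = sim(net, P);               % network outputs, same size as T
```

Be aware that in newer MATLAB releases (R2010b and later) `newff` is deprecated in favor of `feedforwardnet`/`fitnet`, which take the layer sizes directly and infer input/output dimensions from the data passed to `train`.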
