Neural Networks - Perceptron (8) [MATLAB]

A perceptron can be built from a hard-limit transfer function. Running the learning script test (source listed after the transcript) produces:

>> test

e =

       10000


e =

    -1     0    -1    -1    -1


Wij =

   -1.0428   -1.5146


b =

    -1     0    -1    -1    -1


e =

     0     1     0     0     0


Wij =

   -0.0428   -0.5146


b =

    -1     1    -1    -1    -1


e =

     0     0     0     0     0


Wij =

   -0.0428   -0.5146


b =

    -1     1    -1    -1    -1


net =

   -1.5146    0.4425   -1.5146   -1.0428   -1.0428


y =

     0     1     0     0     0

% Perceptron learning of the AND operation
P=[0 1 0 1 1;1 1 1 0 0];   % input patterns, one column per sample
T=[0 1 0 0 0];             % AND targets
[M,N]=size(P);             % M inputs, N samples
[L,N]=size(T);             % L outputs
% weight matrix, random initial values
Wij=rand(L,M);
% bias (threshold) matrix
b=zeros(L,1);
e=10000                    % dummy error so the loop is entered
while (mae(e)>0.0015)      % stop once the mean absolute error is ~0
   net=netsum(Wij*P,b);    % net input: weighted inputs plus bias
   y=hardlim(net);         % hard-limit activation
   e=T-y                   % per-sample error
   Wij=Wij+e*P'            % perceptron weight update
   b=b+e                   % caution: e is 1xN, so b becomes 1xN here;
                           % the textbook rule is b=b+sum(e,2)
end
net=netsum(Wij*P,b)        % final net input
y=hardlim(net)             % final output, equal to T
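
As the loop comments note, e is 1xN, so b grows into a row vector with one bias per training sample (visible in the b = -1 1 -1 -1 -1 echoes above). A minimal variant that keeps the bias scalar, assuming the same toolbox functions (netsum in its cell-array form with concur, plus hardlim and mae) and the textbook batch update b = b + sum(e,2):

  % Perceptron learning of AND with a scalar bias (batch-rule sketch)
  P = [0 1 0 1 1; 1 1 1 0 0];    % inputs, one column per sample
  T = [0 1 0 0 0];               % AND targets
  [M,N] = size(P);
  [L,N] = size(T);
  Wij = rand(L,M);               % random initial weights
  b   = zeros(L,1);              % bias stays Lx1 throughout
  e   = ones(L,N);               % dummy error so the loop is entered
  while mae(e) > 0.0015
      net = netsum({Wij*P, concur(b,N)});  % concur expands b across N samples
      y   = hardlim(net);
      e   = T - y;
      Wij = Wij + e*P';          % same weight update as above
      b   = b + sum(e,2);        % summed error keeps b scalar
  end
  y = hardlim(netsum({Wij*P, concur(b,N)}))  % should equal T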

>> help netsum
 NETSUM Sum net input function.
 
  Syntax
 
    N = netsum({Z1,Z2,...,Zn},FP)
    dN_dZj = netsum('dz',j,Z,N,FP)
    INFO = netsum(CODE)
 
  Description
 
    NETSUM is a net input function.  Net input functions calculate
    a layer's net input by combining its weighted inputs and bias.
 
    NETSUM({Z1,Z2,...,Zn},FP) takes Z1-Zn and optional function parameters,
      Zi - SxQ matrices in a row cell array.
      FP - Row cell array of function parameters (ignored).
    Returns element-wise sum of Z1 to Zn.
 
    NETSUM('dz',j,{Z1,...,Zn},N,FP) returns the derivative of N with
    respect to Zj.  If FP is not supplied the default values are used.
    If N is not supplied, or is [], it is calculated for you.
 
    NETSUM('name') returns the name of this function.
    NETSUM('type') returns the type of this function.
    NETSUM('fpnames') returns the names of the function parameters.
    NETSUM('fpdefaults') returns default function parameter values.
    NETSUM('fpcheck',FP) throws an error for illegal function parameters.
    NETSUM('fullderiv') returns 0 or 1, if the derivative is SxQ or NxSxQ.
 
  Examples
 
    Here NETSUM combines two sets of weighted input vectors and a bias.
    We must use CONCUR to make B the same dimensions as Z1 and Z2.
 
      z1 = [1 2 4; 3 4 1]
      z2 = [-1 2 2; -5 -6 1]
      b = [0; -1]
      n = netsum({z1,z2,concur(b,3)})
 
    Here we assign this net input function to layer i of a network.
 
      net.layers{i}.netFcn = 'netsum';
 
    Use NEWP or NEWLIN to create a standard network that uses NETSUM.
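
As the last line above suggests, NEWP creates a standard perceptron that uses NETSUM internally. A hedged sketch for the same AND problem (newp's argument form changed across toolbox versions; this uses the older min/max-range form):

  P = [0 1 0 1 1; 1 1 1 0 0];
  T = [0 1 0 0 0];
  net = newp(minmax(P), 1);      % one hardlim neuron over the input ranges
  net.trainParam.epochs = 20;    % AND needs only a few passes
  net = train(net, P, T);        % the perceptron rule (learnp) does the updates
  y = sim(net, P)                % should reproduce T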
 
hardlim computes a layer's output from its net input: if the net input reaches the threshold (n >= 0) the output is 1, otherwise 0.

Combined with netsum, this is all that is needed to build the perceptron learning procedure above.
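
For instance, a single forward pass through netsum and hardlim, with hypothetical weights chosen by hand to solve AND:

  Wij = [2 1];  b = -2.5;              % hypothetical weights/bias solving AND
  P = [0 1 0 1 1; 1 1 1 0 0];
  net = netsum({Wij*P, concur(b,5)})   % net = Wij*P + b for every sample
  y = hardlim(net)                     % gives [0 1 0 0 0], AND of the two rows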

>> help hardlim
 HARDLIM Hard limit transfer function.
  
  Syntax
 
    A = hardlim(N,FP)
    dA_dN = hardlim('dn',N,A,FP)
    INFO = hardlim(CODE)
 
  Description
 
    HARDLIM is a neural transfer function.  Transfer functions
    calculate a layer's output from its net input.
 
    HARDLIM(N,FP) takes N and optional function parameters,
      N - SxQ matrix of net input (column) vectors.
      FP - Struct of function parameters (ignored).
    and returns A, the SxQ boolean matrix with 1's where N >= 0.
  
    HARDLIM('dn',N,A,FP) returns SxQ derivative of A with respect to N.
    If A or FP are not supplied or are set to [], FP reverts to
    the default parameters, and A is calculated from N.
 
    HARDLIM('name') returns the name of this function.
    HARDLIM('output',FP) returns the [min max] output range.
    HARDLIM('active',FP) returns the [min max] active input range.
    HARDLIM('fullderiv') returns 1 or 0, whether DA_DN is SxSxQ or SxQ.
    HARDLIM('fpnames') returns the names of the function parameters.
    HARDLIM('fpdefaults') returns the default function parameters.
  
  Examples
 
    Here is how to create a plot of the HARDLIM transfer function.
  
      n = -5:0.1:5;
      a = hardlim(n);
      plot(n,a)
 
    Here we assign this transfer function to layer i of a network.
 
      net.layers{i}.transferFcn = 'hardlim';
 
  Algorithm
 
      hardlim(n) = 1, if n >= 0
                   0, otherwise

>> help hardlims
 HARDLIMS Symmetric hard limit transfer function.
  
  Syntax
 
    A = hardlims(N,FP)
    dA_dN = hardlims('dn',N,A,FP)
    INFO = hardlims(CODE)
 
  Description
  
    HARDLIMS is a neural transfer function.  Transfer functions
    calculate a layer's output from its net input.
 
    HARDLIMS(N,FP) takes N and optional function parameters,
      N - SxQ matrix of net input (column) vectors.
      FP - Struct of function parameters (ignored).
    and returns A, the SxQ +1/-1 matrix with +1's where N >= 0.
  
    HARDLIMS('dn',N,A,FP) returns SxQ derivative of A with respect to N.
    If A or FP are not supplied or are set to [], FP reverts to
    the default parameters, and A is calculated from N.
 
    HARDLIMS('name') returns the name of this function.
    HARDLIMS('output',FP) returns the [min max] output range.
    HARDLIMS('active',FP) returns the [min max] active input range.
    HARDLIMS('fullderiv') returns 1 or 0, whether DA_DN is SxSxQ or SxQ.
    HARDLIMS('fpnames') returns the names of the function parameters.
    HARDLIMS('fpdefaults') returns the default function parameters.
  
  Examples
 
    Here is how to create a plot of the HARDLIMS transfer function.
  
      n = -5:0.1:5;
      a = hardlims(n);
      plot(n,a)
 
    Here we assign this transfer function to layer i of a network.
 
      net.layers{i}.transferFcn = 'hardlims';
 
  Algorithm
 
      hardlims(n) = 1, if n >= 0
                   -1, otherwise

hardlims outputs 1 when the net input reaches the threshold, and -1 otherwise. The demo below applies both functions to the same range of inputs:

>> a=[-5:0.5:5]

a =

  Columns 1 through 6

   -5.0000   -4.5000   -4.0000   -3.5000   -3.0000   -2.5000

  Columns 7 through 12

   -2.0000   -1.5000   -1.0000   -0.5000         0    0.5000

  Columns 13 through 18

    1.0000    1.5000    2.0000    2.5000    3.0000    3.5000

  Columns 19 through 21

    4.0000    4.5000    5.0000

>> c=hardlim(a)

c =

  Columns 1 through 11

     0     0     0     0     0     0     0     0     0     0     1

  Columns 12 through 21

     1     1     1     1     1     1     1     1     1     1

>> d=hardlims(a)

d =

  Columns 1 through 11

    -1    -1    -1    -1    -1    -1    -1    -1    -1    -1     1

  Columns 12 through 21

     1     1     1     1     1     1     1     1     1     1

>>
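
Because hardlims outputs +/-1, the same perceptron can be trained against targets recoded to -1/+1. A minimal plain-MATLAB sketch (no netsum call, just Wij*P + b with a scalar bias):

  P = [0 1 0 1 1; 1 1 1 0 0];
  T = [-1 1 -1 -1 -1];          % AND targets recoded for hardlims
  Wij = rand(1,2);  b = 0;
  for epoch = 1:50              % AND converges in a handful of epochs
      y = hardlims(Wij*P + b);
      e = T - y;                % entries are -2, 0 or 2
      if all(e == 0), break; end
      Wij = Wij + e*P';         % same perceptron updates as before
      b = b + sum(e);
  end
  y = hardlims(Wij*P + b)       % should equal T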
