[MATLAB Deep Learning] Neural Networks and Classification Problems

Neural Networks and Classification Problems

1. Multiclass Classification

The most reliable way to get good results is to set the number of output nodes equal to the number of classes, with the class labels represented in one-hot encoding. By convention, binary classification uses the Sigmoid function at the output, while multiclass classification uses the Softmax function. Unlike Sigmoid, Softmax does not consider each node's weighted sum in isolation; every output value also depends on the other output nodes. Correctly interpreting a multiclass network's output therefore means comparing the relative magnitudes of all output nodes. Softmax guarantees that the outputs sum to 1, so they can be read as class probabilities; it also works for the binary case.
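These two properties (outputs summing to 1, and each output depending on every input) can be illustrated with a small sketch. Python/NumPy is used here purely for illustration; the MATLAB definitions actually used by the network appear later in this post.

```python
import numpy as np

def softmax(v):
    # subtracting the max does not change the result but avoids overflow
    ex = np.exp(v - np.max(v))
    return ex / ex.sum()

v = np.array([2.0, 1.0, 0.1])
y = softmax(v)
print(y)          # the largest input gets the largest probability
print(y.sum())    # always 1.0
```

Note that changing any single input rescales all the outputs, which is exactly why the outputs must be read relative to one another.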

A multiclass example: the input data are five 5×5 binary matrices depicting the digits 1, 2, 3, 4, and 5. The network has 25 input nodes, 50 hidden nodes, and 5 output nodes. The training function is as follows:

function [W1, W2] = MultiClass(W1, W2, X, D)
    alpha = 0.9;                        % learning rate
    N = 5;
    for k = 1:N
        x  = reshape(X(:, :, k), 25, 1); % k-th image flattened to a 25x1 vector
        d  = D(k, :)';                   % one-hot target for the k-th image
        v1 = W1*x;
        y1 = Sigmoid(v1);                % hidden layer
        v  = W2*y1;
        y  = Softmax(v);                 % output layer
        e     = d - y;
        delta = e;                       % with Softmax + cross-entropy loss,
                                         % the output delta reduces to d - y
        e1     = W2'*delta;              % back-propagate the error
        delta1 = y1.*(1-y1).*e1;
        dW1 = alpha*delta1*x';           % SGD weight updates
        W1  = W1 + dW1;
        dW2 = alpha*delta*y1';
        W2  = W2 + dW2;
    end
end

The Softmax function is defined as follows:

function y = Softmax(x)
    ex = exp(x);
    y  = ex / sum(ex);
end

The Sigmoid function is defined as follows:

function y = Sigmoid(x)
    y = 1 ./ (1 + exp(-x));
end

The test script is as follows:

clear all
rng(3);

X = zeros(5, 5, 5);
X(:, :, 1) = [ 0 1 1 0 0;
               0 0 1 0 0;
               0 0 1 0 0;
               0 0 1 0 0;
               0 1 1 1 0 ];
X(:, :, 2) = [ 1 1 1 1 0;
               0 0 0 0 1;
               0 1 1 1 0;
               1 0 0 0 0;
               1 1 1 1 1 ];
X(:, :, 3) = [ 1 1 1 1 0;
               0 0 0 0 1;
               0 1 1 1 0;
               0 0 0 0 1;
               1 1 1 1 0 ];
X(:, :, 4) = [ 0 0 0 1 0;
               0 0 1 1 0;
               0 1 0 1 0;
               1 1 1 1 1;
               0 0 0 1 0 ];
X(:, :, 5) = [ 1 1 1 1 1;
               1 0 0 0 0;
               1 1 1 1 0;
               0 0 0 0 1;
               1 1 1 1 0 ];

D = [ 1 0 0 0 0;
      0 1 0 0 0;
      0 0 1 0 0;
      0 0 0 1 0;
      0 0 0 0 1 ];

W1 = 2*rand(50, 25) - 1;   % random initial weights in [-1, 1]
W2 = 2*rand( 5, 50) - 1;

for epoch = 1:10000        % train
    [W1, W2] = MultiClass(W1, W2, X, D);
end

N = 5;                     % inference
for k = 1:N
    x  = reshape(X(:, :, k), 25, 1);
    v1 = W1*x;
    y1 = Sigmoid(v1);
    v  = W2*y1;
    y  = Softmax(v)        % no semicolon: print each output vector
end

The network classifies all five training digits correctly.
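For readers without MATLAB, the whole experiment can be reproduced with a NumPy sketch. The variable names mirror the MATLAB listing; the random seed behaves differently in NumPy, so individual weight values will not match the MATLAB run, but training still converges to the correct classifications.

```python
import numpy as np

rng = np.random.default_rng(3)

# the five 5x5 digit images from the listing above, one string per row
digits = [
    ["01100", "00100", "00100", "00100", "01110"],  # "1"
    ["11110", "00001", "01110", "10000", "11111"],  # "2"
    ["11110", "00001", "01110", "00001", "11110"],  # "3"
    ["00010", "00110", "01010", "11111", "00010"],  # "4"
    ["11111", "10000", "11110", "00001", "11110"],  # "5"
]
X = np.array([[[int(c) for c in row] for row in d] for d in digits], dtype=float)
D = np.eye(5)                        # one-hot targets

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def softmax(v):
    ex = np.exp(v - v.max())         # max-subtraction for numerical stability
    return ex / ex.sum()

W1 = 2 * rng.random((50, 25)) - 1    # input  -> hidden weights in [-1, 1]
W2 = 2 * rng.random((5, 50)) - 1     # hidden -> output weights in [-1, 1]
alpha = 0.9                          # learning rate

for epoch in range(10000):           # same loop structure as MultiClass.m
    for k in range(5):
        x = X[k].reshape(25)
        d = D[k]
        y1 = sigmoid(W1 @ x)
        y = softmax(W2 @ y1)
        delta = d - y                # softmax + cross-entropy output delta
        delta1 = y1 * (1 - y1) * (W2.T @ delta)
        W1 += alpha * np.outer(delta1, x)
        W2 += alpha * np.outer(delta, y1)

preds = [int(np.argmax(softmax(W2 @ sigmoid(W1 @ X[k].reshape(25))))) + 1
         for k in range(5)]
print(preds)                         # with successful training: [1, 2, 3, 4, 5]
```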

2. Multiclass Classification with Slightly Contaminated Data

Real-world data rarely matches the training data exactly, so we give the network built above a quick sanity check with slightly contaminated versions of the five digits. The code is as follows:

clear all
TestMultiClass;            % runs the training above; leaves W1, W2 in the workspace

X = zeros(5, 5, 5);
X(:, :, 1) = [ 0 0 1 1 0;
               0 0 1 1 0;
               0 1 0 1 0;
               0 0 0 1 0;
               0 1 1 1 0 ];
X(:, :, 2) = [ 1 1 1 1 0;
               0 0 0 0 1;
               0 1 1 1 0;
               1 0 0 0 1;
               1 1 1 1 1 ];
X(:, :, 3) = [ 1 1 1 1 0;
               0 0 0 0 1;
               0 1 1 1 0;
               1 0 0 0 1;
               1 1 1 1 0 ];
X(:, :, 4) = [ 0 1 1 1 0;
               0 1 0 0 0;
               0 1 1 1 0;
               0 0 0 1 0;
               0 1 1 1 0 ];
X(:, :, 5) = [ 0 1 1 1 1;
               0 1 0 0 0;
               0 1 1 1 0;
               0 0 0 1 0;
               1 1 1 1 0 ];

N = 5;                     % inference
for k = 1:N
    x  = reshape(X(:, :, k), 25, 1);
    v1 = W1*x;
    y1 = Sigmoid(v1);
    v  = W2*y1;
    y  = Softmax(v)
end

The outputs are [0.0208, 0.0006, 0.0363, 0.9164, 0.0259], [0.0000, 0.9961, 0.0038, 0.0000, 0.0000], [0.0001, 0.0198, 0.9798, 0.0001, 0.0002], [0.0930, 0.3057, 0.5397, 0.0408, 0.0208], and [0.0363, 0.3214, 0.0717, 0.0199, 0.5506].
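Picking the winning class is just an argmax over each output vector. A small NumPy check of the numbers reported above:

```python
import numpy as np

# the five softmax output vectors reported above
outputs = np.array([
    [0.0208, 0.0006, 0.0363, 0.9164, 0.0259],
    [0.0000, 0.9961, 0.0038, 0.0000, 0.0000],
    [0.0001, 0.0198, 0.9798, 0.0001, 0.0002],
    [0.0930, 0.3057, 0.5397, 0.0408, 0.0208],
    [0.0363, 0.3214, 0.0717, 0.0199, 0.5506],
])
classes = outputs.argmax(axis=1) + 1   # convert to 1-based class labels
print(classes.tolist())                # [4, 2, 3, 3, 5]
```

So the contaminated "1" is mistaken for a "4" and the contaminated "4" for a "3", while "2" and "3" remain confidently correct and the "5" wins only narrowly over "2" — illustrating that slight input corruption can flip the relative magnitudes of the outputs.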

The rng function used in the code above is defined as follows (a shim for older MATLAB versions without a built-in rng):

function rng(x)
    randn('seed', x)
    rand('seed', x)
end

Source: https://www.cnblogs.com/Negan-ZW/p/9613233.html
