Switching classes in MATLAB neural network training

I trained a neural network in MATLAB for binary classification using patternnet(). I was surprised to find that if I switch the classes (remember this is binary classification, so there are only 2 classes), I do not get the inverted probabilities. Here are 2 constructed examples whose only difference is y, where the rows corresponding to the two classes have been swapped.

rng('default')

% case 1:
hiddenLayerSize = [10];
trainingFct = 'trainscg'; % possible choice: 'trainscg','trainlm'
errorFcn = 'crossentropy';
net1 = patternnet(hiddenLayerSize,trainingFct,errorFcn);

% Set up Division of Data for Training, Validation, Testing
net1.divideParam.trainRatio = 70/100;
net1.divideParam.valRatio = 15/100;
net1.divideParam.testRatio = 15/100;

% Do not show GUI
net1.trainParam.showWindow = false;

X = [5.1,4.9,4.70,4.60,5,5.40,4.60,5,4.40;...
    3.50,3,3.20,3.10,3.60,3.90,3.40,3.40,2.90;...
    1.40,1.40,1.30,1.50,1.40,1.70,1.40,1.50,1.40;...
    0.200,0.200,0.200,0.200,0.200,0.400,0.300,0.200,0.200];
y_1 = [1,1,0,0,0,1,0,1,0;0,0,1,1,1,0,1,0,1];

[net1,tr1] = train(net1,X,y_1);
proba1 = net1(X(:,1))

% case 2:
hiddenLayerSize = [10];
trainingFct = 'trainscg'; % possible choice: 'trainscg','trainlm'
errorFcn = 'crossentropy';
net2 = patternnet(hiddenLayerSize,trainingFct,errorFcn);

% Set up Division of Data for Training, Validation, Testing
net2.divideParam.trainRatio = 70/100;
net2.divideParam.valRatio = 15/100;
net2.divideParam.testRatio = 15/100;

% Do not show GUI
net2.trainParam.showWindow = false;

X = [5.1,4.9,4.70,4.60,5,5.40,4.60,5,4.40;...
    3.50,3,3.20,3.10,3.60,3.90,3.40,3.40,2.90;...
    1.40,1.40,1.30,1.50,1.40,1.70,1.40,1.50,1.40;...
    0.200,0.200,0.200,0.200,0.200,0.400,0.300,0.200,0.200];
y_2 = [0,0,1,1,1,0,1,0,1;1,1,0,0,0,1,0,1,0];

[net2,tr2] = train(net2,X,y_2);
proba2 = net2(X(:,1))

Running this code (with rng('default') for reproducibility) gives:

proba1 =

    0.1401
    0.8599

proba2 =

    0.0000
    1.0000

Looking at the probabilities in the two cases, I was surprised to see that they are not simply swapped with respect to each other, but completely different. Any suggestion or reason for this behaviour?
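One likely factor, sketched below (this is an assumption about the cause, not something stated above): rng('default') is only called once, and both the weight initialisation and the default 'dividerand' train/validation/test split draw random numbers when train runs, so the second run starts from a different generator state than the first. Re-seeding immediately before each run makes the two networks start from the same initial weights and the same data split, so any remaining difference is attributable to the swapped target rows alone (a minimal sketch, reusing X, y_1, y_2 from above):

% Sketch: reset the generator before each run so both networks see the
% identical random sequence (initial weights and 'dividerand' split).
rng('default')
netA = patternnet(10,'trainscg','crossentropy');
netA.divideParam.trainRatio = 0.70;
netA.divideParam.valRatio   = 0.15;
netA.divideParam.testRatio  = 0.15;
netA.trainParam.showWindow  = false;
[netA,trA] = train(netA,X,y_1);

rng('default')   % identical generator state before the second run
netB = patternnet(10,'trainscg','crossentropy');
netB.divideParam.trainRatio = 0.70;
netB.divideParam.valRatio   = 0.15;
netB.divideParam.testRatio  = 0.15;
netB.trainParam.showWindow  = false;
[netB,trB] = train(netB,X,y_2);

probaA = netA(X(:,1))
probaB = netB(X(:,1))
isequal(trA.trainInd,trB.trainInd)   % expect true: both runs used the same split

Even with identical starting conditions the two outputs need not mirror each other exactly, since swapping the target rows does not swap the (asymmetric) initial weights of the two output neurons, but the comparison is then at least between runs with the same initialisation and the same data division.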
