[Reading Notes 1] [2017] MATLAB Deep Learning: Implementation of the SGD Method (1)

We can see how well the neural network was trained by comparing its output with the correct output.

clear all
X = [ 0 0 1;
      0 1 1;
      1 0 1;
      1 1 1;
    ];
D = [ 0 0 1 1 ];

W = 2*rand(1, 3) - 1;        % initialize the weights with random numbers

for epoch = 1:10000          % train
  W = DeltaSGD(W, X, D);
end

N = 4;                       % inference
for k = 1:N
  x = X(k, :)';              % k-th training input as a column vector
  v = W*x;
  y = Sigmoid(v)             % no semicolon: print the output for each training point
end
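
The test script above calls DeltaSGD and Sigmoid, which live in their own files (DeltaSGD.m and Sigmoid.m) following the convention described below. Their exact listings appear elsewhere in the book; the following is a minimal sketch of what they can look like, assuming the same delta-rule update that DeltaBatch uses later in this section.

function W = DeltaSGD(W, X, D)       % sketch of DeltaSGD.m
  alpha = 0.9;                       % learning rate
  N = 4;
  for k = 1:N
    x = X(k, :)';                    % k-th training input as a column vector
    d = D(k);                        % corresponding correct output
    v = W*x;
    y = Sigmoid(v);
    e     = d - y;
    delta = y*(1-y)*e;               % sigmoid derivative times error
    dW    = alpha*delta*x;
    W(1)  = W(1) + dW(1);            % SGD: update immediately after each point
    W(2)  = W(2) + dW(2);
    W(3)  = W(3) + dW(3);
  end
end

function y = Sigmoid(x)              % sketch of Sigmoid.m (kept in a separate file)
  y = 1 / (1 + exp(-x));             % logistic sigmoid
end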

This code initializes the weights with random real numbers between -1 and 1.
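
As a quick aside on how that expression works: rand returns values drawn uniformly from [0, 1), so scaling by 2 and subtracting 1 maps them to [-1, 1). The values in the comment below are illustrative only; every run produces different numbers.

% rand(1, 3) draws three values uniformly from [0, 1);
% 2*rand(1, 3) - 1 rescales them to the interval [-1, 1).
W = 2*rand(1, 3) - 1                 % e.g. W = [-0.43  0.91  0.07] (illustrative only)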

Executing this code produces the following values.

These output values are very close to the correct outputs in D.

Therefore, we can conclude that the neural network has been properly trained.

[Figure: output values y produced by the trained network]
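
If you prefer to check this programmatically rather than by eye, a small addition at the end of the test script (not part of the book's listing) can collect the outputs and compare their rounded values against D:

% Optional check, not in the book's listing: gather the outputs and
% compare their rounded values against the correct outputs in D.
y_all = zeros(1, N);
for k = 1:N
  x        = X(k, :)';
  y_all(k) = Sigmoid(W*x);
end
disp([round(y_all); D])              % the two rows should agree after training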

Every example in this book implements the algorithm and the test program in separate files.

This is because putting them together often makes the code more complicated and hampers efficient analysis of the algorithm.

The file name of the test program starts with Test, followed by the name of the algorithm file.

The algorithm file is named after the function it implements, in compliance with MATLAB's naming convention.

For example, the implementation file of the DeltaSGD function is named DeltaSGD.m.

Algorithm implementation example: DeltaSGD.m

Test program example: TestDeltaSGD.m

Implementation of the Batch Method

The function DeltaBatch implements the delta rule of Equation 2.7 using the batch method.
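
Read off from the code below, the update applied to each weight is the delta rule with a sigmoid activation; written out (this rendering is mine, not a quotation of the book's Equation 2.7):

\[
\delta = \varphi'(v)\,e = y(1-y)(d-y), \qquad \Delta w_j = \alpha\,\delta\,x_j
\]

In the batch method, the average of \(\Delta w\) over the N training points is applied to W once per epoch, rather than after every point.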

It takes the weights and training data of the neural network and returns the trained weights.

W = DeltaBatch(W, X, D)

In this function definition, the variables carry the same meaning as those in the function DeltaSGD: W is the weight of the neural network, and X and D are the input and correct output of the training data, respectively.

The following listing shows the DeltaBatch.m file, which implements the function DeltaBatch.

function W = DeltaBatch(W, X, D)
  alpha = 0.9;               % learning rate

  dWsum = zeros(3, 1);       % accumulator for the weight updates

  N = 4;
  for k = 1:N
    x = X(k, :)';            % k-th training input as a column vector
    d = D(k);                % corresponding correct output

    v = W*x;
    y = Sigmoid(v);

    e     = d - y;
    delta = y*(1-y)*e;       % sigmoid derivative times error

    dW    = alpha*delta*x;

    dWsum = dWsum + dW;      % accumulate; do not update W yet
  end
  dWavg = dWsum / N;         % average update over all training points

  W(1) = W(1) + dWavg(1);    % apply the averaged update once per epoch
  W(2) = W(2) + dWavg(2);
  W(3) = W(3) + dWavg(3);
end
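
Following the naming convention above, the matching test program would be TestDeltaBatch.m. The sketch below assumes the same training data and initialization as TestDeltaSGD; the book's own listing may differ, for example in the number of epochs (40000 here is an assumed value, since each batch epoch applies only one averaged update).

clear all
X = [ 0 0 1;
      0 1 1;
      1 0 1;
      1 1 1;
    ];
D = [ 0 0 1 1 ];

W = 2*rand(1, 3) - 1;                % random initial weights in [-1, 1)

for epoch = 1:40000                  % train (epoch count is an assumption)
  W = DeltaBatch(W, X, D);
end

N = 4;                               % inference
for k = 1:N
  x = X(k, :)';
  v = W*x;
  y = Sigmoid(v)
end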

Translated from Matlab Deep Learning by Phil Kim.
