CNN MATLAB Notes (3): Perform Transfer Learning to Fine-Tune a Network with Your Data

Fine-tune a pretrained convolutional neural network to learn the features of a new collection of images.

Transfer learning is commonly used in deep learning applications. You can take a pretrained network and use it as a starting point to learn a new task. Fine-tuning a network with transfer learning is much faster and easier than training a network from scratch, and you can quickly transfer learned features to a new task using a smaller number of training images.

Step 1: Same as Step 1 in the earlier post, CNN MATLAB Notes (2): Feature Extraction using CNN.

One point deserves special attention: if the training images differ in size from the image input layer, then you must resize or crop the image data. The images in merchImages are the same size as the input size of AlexNet, so you do not need to resize or crop the new image data. In other words, all image data should match the image size AlexNet was trained on (227×227×3).
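If your own images did not match, you would need a resize step. A minimal sketch, assuming a MATLAB release that provides augmentedImageDatastore (the datastore names merchImagesTrain/merchImagesTest come from Step 1):

% Resize images on the fly to AlexNet's 227x227 input; labels are kept.
augTrain = augmentedImageDatastore([227 227],merchImagesTrain);
augTest  = augmentedImageDatastore([227 227],merchImagesTest);
% Pass augTrain/augTest to trainNetwork/classify in place of the raw datastores.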

Step 2: Load a pretrained AlexNet network.

net = alexnet;
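You can inspect the loaded network to confirm its structure and expected input size; a small sketch:

net.Layers                            % lists all 25 layers of AlexNet
inputSize = net.Layers(1).InputSize   % returns [227 227 3]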

Step 3: Build the new layer array: transfer layers 1:end-3 of the network and create three new final layers to be fine-tuned.

The last three layers of the pretrained network net are configured for 1000 classes. (The full AlexNet network was built to recognize 1000 object categories, but a typical application only needs to recognize a few.) These three layers must be fine-tuned for the new classification problem. Extract all the layers except the last three from the pretrained network net and transfer them.

The code:

layersTransfer = net.Layers(1:end-3);
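To see exactly what is being replaced, you can display the last three layers; a sketch (fc8/prob/output are AlexNet's standard layer names in MATLAB):

net.Layers(end-2:end)   % fc8 (fully connected, 1000 outputs), prob (softmax), output (classification)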

Transfer the layers to the new task by replacing the last three layers with a fully connected layer, a softmax layer, and a classification output layer. Specify the options of the new fully connected layer according to the new data: set the fully connected layer to the same size as the number of classes in the new data. To speed up training, also increase the 'WeightLearnRateFactor' and 'BiasLearnRateFactor' values in the fully connected layer; the larger they are, the faster those layers learn.

Determine the number of classes from the training data. (To guarantee that the new fully connected layer matches the number of classes in the new data, first count the categories in the training set.)

The code:

numClasses = numel(categories(merchImagesTrain.Labels))

numClasses =
     5

Create the layer array by combining the transferred layers with the new layers. (The new network consists of the former layers 1:end-3 plus the three replacement layers; fine-tuning mainly trains these last three layers.)

The code:

layers = [...
    layersTransfer
    fullyConnectedLayer(numClasses,'WeightLearnRateFactor',20,'BiasLearnRateFactor',20)
    softmaxLayer
    classificationLayer];

Explanation of the code: layersTransfer holds the former AlexNet layers 1:end-3; the fullyConnectedLayer is the new, key layer whose parameters you can tune.
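As a quick sanity check (a sketch): the assembled array should have the same depth as the original AlexNet, 22 transferred layers plus 3 new ones.

numel(layers)         % 25
layers(end-2:end)     % the three freshly created layers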

Step 4: Train the network with the new layer array.

First, create the training options.

Option 1: 'InitialLearnRate'. For transfer learning, you want to keep the features from the early layers of the pretrained network (the transferred layer weights). Set 'InitialLearnRate' to a low value; a low initial learn rate slows down learning in the transferred layers.

Option 2: 'WeightLearnRateFactor' and 'BiasLearnRateFactor'. In the previous step, you set these learn rate factors for the fully connected layer to higher values to speed up learning in the new final layers. They are not set here in trainingOptions, but the larger they are, the faster the new layers learn.

This combination results in fast learning only on the new layers, while keeping the other layers essentially fixed.

Option 3: 'MaxEpochs'. When performing transfer learning, you do not need to train for as many epochs. To speed up training, you can reduce the value of the 'MaxEpochs' name-value pair argument in the call to trainingOptions.

Option 4: 'MiniBatchSize'. To reduce memory usage, reduce 'MiniBatchSize'.

The code:


options = trainingOptions('sgdm',...
    'MiniBatchSize',5,...
    'MaxEpochs',10,...
    'InitialLearnRate',0.0001);
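Optionally (not in the original post), you can also monitor training and validate on held-out data. A sketch, assuming a release where trainingOptions supports 'ValidationData' and 'Plots':

options = trainingOptions('sgdm',...
    'MiniBatchSize',5,...
    'MaxEpochs',10,...
    'InitialLearnRate',0.0001,...
    'ValidationData',merchImagesTest,...   % for illustration only; reuses the test set
    'Plots','training-progress');          % live loss/accuracy plot during training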

Next, start training: fine-tune the network using trainNetwork on the new layer array.

The code:

netTransfer = trainNetwork(merchImagesTrain,layers,options);

Result:

Training on single CPU.
Initializing image normalization.
|=========================================================================================|
| Epoch | Iteration | Time Elapsed | Mini-batch | Mini-batch | Base Learning|
| | | (seconds) | Loss | Accuracy | Rate |
|=========================================================================================|
| 1 | 1 | 36.70 | 1.4886 | 60.00% | 1.00e-04 |
| 5 | 50 | 1687.54 | -0.0000 | 100.00% | 1.00e-04 |
| 9 | 100 | 3394.59 | -0.0000 | 100.00% | 1.00e-04 |
| 10 | 120 | 4049.95 | -0.0000 | 100.00% | 1.00e-04 |
|=========================================================================================|

Because I ran this on a laptop CPU, training took a long time; it normally does not take this long.
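On a machine with a supported GPU the same run is far faster. A sketch ('ExecutionEnvironment' is a real trainingOptions argument, but GPU training requires Parallel Computing Toolbox and a CUDA-capable GPU):

options = trainingOptions('sgdm',...
    'MiniBatchSize',5,...
    'MaxEpochs',10,...
    'InitialLearnRate',0.0001,...
    'ExecutionEnvironment','gpu');   % train on the GPU instead of the CPU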

Step 5: Prediction and evaluation.

First, classify the test images with the fine-tuned network:

predictedLabels = classify(netTransfer,merchImagesTest);

Display a few test images with their predicted labels. The code:

idx = [1 4 7 10];
figure
for i = 1:numel(idx)
    subplot(2,2,i)
    I = readimage(merchImagesTest,idx(i));
    label = predictedLabels(idx(i));
    imshow(I)
    title(char(label))
    drawnow
end

Evaluation: calculate the classification accuracy. The code:

testLabels = merchImagesTest.Labels;
accuracy = sum(predictedLabels==testLabels)/numel(predictedLabels)

accuracy =
    0.9333

This example achieves high accuracy. If the accuracy is not high enough using transfer learning, try feature extraction instead (extract CNN features, then train an SVM or another classifier).
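Beyond overall accuracy, a per-class breakdown often helps diagnose which categories are confused. A sketch using confusionmat (from Statistics and Machine Learning Toolbox; not part of the original post):

C = confusionmat(testLabels,predictedLabels)   % rows: true class, columns: predicted class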
