4 Transfer Learning in MATLAB

Typical workflow for transfer learning

To perform transfer learning, you need to create three components:

  1. An array of layers representing the network architecture. For transfer learning, this is created by modifying a preexisting network such as AlexNet.

  2. Images with known labels to be used as training data, typically provided as an image datastore.

  3. A variable containing the options that control the behavior of the training algorithm.

These three components are provided as the inputs to the trainNetwork function, which returns the trained network as output.
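As a minimal sketch (the variable names here are placeholders, not part of the course exercises), the call has the form:

% trainImgs : labeled image datastore of training images
% layers    : layer array adapted from a pretrained network
% opts      : options returned by trainingOptions
net = trainNetwork(trainImgs, layers, opts);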

 

You should test the performance of the newly trained network. If it is not adequate, you will typically need to adjust some of the training options and retrain.
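A minimal sketch of such a test, assuming a trained network net and a labeled datastore of held-out images testImgs (both placeholder names), with an augmentedImageDatastore used to resize the images to AlexNet's 227-by-227 input size:

testds = augmentedImageDatastore([227 227], testImgs);   % resize to the network's input size
preds = classify(net, testds);                           % predicted labels for the test images
accuracy = mean(preds == testImgs.Labels)                % fraction of correct predictions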

Labeling training images

 

When training a network, you need to provide known labels for the training images. The Flowers folder contains 12 subfolders, each of which contains 80 images of one type of flower. The name of the folder can therefore be used to provide the labels needed for training.


Label Images in a Datastore


 

This code creates a datastore of 960 flower images.

load pathToImages

flwrds = imageDatastore(pathToImages,'IncludeSubfolders',true);

flowernames = flwrds.Labels   % empty cell array, since no label source was specified

 

Task 1

Create datastore with labels

flwrds = imageDatastore(pathToImages,'IncludeSubfolders',true,'LabelSource','foldernames')

 

Task 2

Extract new labels

flowernames = flwrds.Labels
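A quick way to confirm that the folder names were picked up correctly is countEachLabel, which tabulates the number of images per label (an optional check, not one of the graded tasks):

countEachLabel(flwrds)   % 12 rows, one per flower type, each with a Count of 80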

4.3 Preparing Training Data (2/3): Split Data for Training and Testing


 

This code creates a labeled datastore of 960 flower images.

load pathToImages

flwrds = imageDatastore(pathToImages,'IncludeSubfolders',true,'LabelSource','foldernames')

 

Task 1

Split datastore

[flwrTrain, flwrTest] = splitEachLabel(flwrds,0.6)

The general syntax is

[ds1,ds2] = splitEachLabel(imds,p)

Task 2

Split datastore randomly

[flwrTrain, flwrTest] = splitEachLabel(flwrds,0.8,'randomized')

When p is a value from 0 to 1, it is interpreted as a proportion: the images with each label are split so that ds1 receives that proportion of them and ds2 receives the rest. You can also specify an exact number of files to take from each label and assign to ds1:

[ds1,ds2] = splitEachLabel(imds,n)


This ensures that every label in ds1 has n images, even if the categories do not all contain the same number of images.

 


Task 3

Split datastore by number of images

Split the datastore flwrds into two datastores flwrTrain and flwrTest, such that 50 files in each category are in flwrTrain.

[flwrTrain, flwrTest] = splitEachLabel(flwrds,50)
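The same countEachLabel check can confirm the split (optional, not a graded task):

countEachLabel(flwrTrain)   % each of the 12 labels should show a Count of 50
countEachLabel(flwrTest)    % the remaining 30 images per label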

Modify Network Layers


 

This code imports AlexNet and extracts its layers.

anet = alexnet;

layers = anet.Layers
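Inspecting the end of this array shows why the final layers need replacing: AlexNet was trained on the 1000 ImageNet classes, so its last fully connected layer has 1000 outputs and its output layer is tied to those 1000 class names. An optional look:

layers(23)     % fully connected layer with an OutputSize of 1000 (ImageNet classes)
layers(end)    % classification output layer holding the 1000 ImageNet class names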

 

Task 1

Create new layer

fc = fullyConnectedLayer(12)

 

Task 2

Replace 23rd layer

layers(23) = fc

 

Task 3

Replace last layer

layers(end) = classificationLayer
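Displaying the tail of the modified array confirms the changes (optional check):

layers(end-2:end)   % the new 12-output fully connected layer, the original softmax layer,
                    % and the new, as-yet-untrained classification output layer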

Set Training Options


 

Task 1

Set default options

opts = trainingOptions('sgdm')

 

Task 2

Set initial learning rate

opts = trainingOptions('sgdm','InitialLearnRate',0.001)
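With the labeled training data, the modified layer array, and these options in place, the three components can be handed to trainNetwork. A sketch of the final call (the output names flwrnet and info are placeholders), assuming the flower images still need to be resized to AlexNet's 227-by-227 input size with an augmentedImageDatastore, a step the tasks above do not cover:

trainds = augmentedImageDatastore([227 227], flwrTrain);   % resize the labeled training images
[flwrnet, info] = trainNetwork(trainds, layers, opts);     % trained network and training progress info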

 

 

 
