TensorFlow & Keras fine-tuning

First, a clarification: Keras is a high-level wrapper around TensorFlow, so anything that works in TensorFlow can also be used in Keras, as the small sketch below illustrates.
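
A small illustration of that interoperability, as a sketch assuming the TensorFlow backend (the shapes here are arbitrary): a plain TensorFlow tensor can be fed directly into a Keras layer, and the result is again an ordinary TensorFlow tensor.

import tensorflow as tf
from keras.layers import Dense
from keras import backend as K

# a plain TensorFlow placeholder fed straight into a Keras layer
inputs = tf.placeholder(tf.float32, shape=(None, 784))
preds = Dense(10, activation='softmax')(inputs)  # preds is an ordinary TF tensor

# Keras and TensorFlow share the same underlying session/graph
sess = K.get_session()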


1. Fine-tuning in Keras:

There is an example in the official documentation: https://keras.io/applications

from keras.applications.inception_v3 import InceptionV3
from keras.preprocessing import image
from keras.models import Model
from keras.layers import Dense, GlobalAveragePooling2D
from keras import backend as K

# create the base pre-trained model
base_model = InceptionV3(weights='imagenet', include_top=False)

# add a global spatial average pooling layer
x = base_model.output
x = GlobalAveragePooling2D()(x)
# let's add a fully-connected layer
x = Dense(1024, activation='relu')(x)
# and a logistic layer -- let's say we have 200 classes
predictions = Dense(200, activation='softmax')(x)

# this is the model we will train
model = Model(inputs=base_model.input, outputs=predictions)

# first: train only the top layers (which were randomly initialized)
# i.e. freeze all convolutional InceptionV3 layers
for layer in base_model.layers:
    layer.trainable = False

# compile the model (should be done *after* setting layers to non-trainable)
model.compile(optimizer='rmsprop', loss='categorical_crossentropy')

# train the model on the new data for a few epochs
model.fit_generator(...)

# at this point, the top layers are well trained and we can start fine-tuning
# convolutional layers from inception V3. We will freeze the bottom N layers
# and train the remaining top layers.

# let's visualize layer names and layer indices to see how many layers
# we should freeze:
for i, layer in enumerate(base_model.layers):
    print(i, layer.name)

# we chose to train the top 2 inception blocks, i.e. we will freeze
# the first 249 layers and unfreeze the rest:
for layer in model.layers[:249]:
    layer.trainable = False
for layer in model.layers[249:]:
    layer.trainable = True

# we need to recompile the model for these modifications to take effect
# we use SGD with a low learning rate
from keras.optimizers import SGD
model.compile(optimizer=SGD(lr=0.0001, momentum=0.9), loss='categorical_crossentropy')

# we train our model again (this time fine-tuning the top 2 inception blocks
# alongside the top Dense layers)
model.fit_generator(...)
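
The two fit_generator(...) calls above are left elided in the official example. Below is a minimal sketch of how the data could be fed in, assuming (hypothetically) that the images are arranged one sub-directory per class under data/train and data/val; the paths, batch size, and epoch count are placeholders:

from keras.applications.inception_v3 import preprocess_input
from keras.preprocessing.image import ImageDataGenerator

batch_size = 32  # assumed; adjust to your GPU memory

# use InceptionV3's own preprocessing (scales pixels to [-1, 1]);
# a real pipeline would usually add augmentation options here as well
train_datagen = ImageDataGenerator(preprocessing_function=preprocess_input)
val_datagen = ImageDataGenerator(preprocessing_function=preprocess_input)

# 'data/train' and 'data/val' are placeholder paths; InceptionV3 expects 299x299 inputs
train_generator = train_datagen.flow_from_directory(
    'data/train', target_size=(299, 299),
    batch_size=batch_size, class_mode='categorical')
val_generator = val_datagen.flow_from_directory(
    'data/val', target_size=(299, 299),
    batch_size=batch_size, class_mode='categorical')

model.fit_generator(
    train_generator,
    steps_per_epoch=train_generator.samples // batch_size,
    epochs=5,
    validation_data=val_generator,
    validation_steps=val_generator.samples // batch_size)

Note that preprocess_input applies the normalization InceptionV3 was trained with, which matters more than the exact augmentation settings.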

Definitions of the various models: https://github.com/fchollet/deep-learning-models

Baidu Cloud download link for the weights: https://pan.baidu.com/s/1geHmOpH#list/path=%2Fkeras%2Fkeras_weights

Examples of fine-tuning with Keras:

https://blog.keras.io/building-powerful-image-classification-models-using-very-little-data.html

https://deeplearningsandbox.com/how-to-use-transfer-learning-and-fine-tuning-in-keras-and-tensorflow-to-build-an-image-recognition-94b0b02444f2

https://github.com/FightForCS/cnn_finetune



2. Fine-tuning in TensorFlow:

TensorFlow currently offers more fine-tunable pre-trained models than Keras does.

Definitions of the various models: https://github.com/tensorflow/models/tree/master/research/slim/nets

Examples of fine-tuning with TensorFlow:

There is an official example (https://github.com/tensorflow/models/tree/master/research/slim); the train_image_classifier.py file under slim describes the whole procedure. For a breakdown of each part of the code, see the links below (a minimal sketch of the script's core logic follows them):

https://kwotsin.github.io/tech/2017/02/11/transfer-learning.html

https://kwotsin.github.io/tech/2017/01/29/tfrecords.html
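
As a complement to those walkthroughs, here is a minimal sketch of the core idea behind train_image_classifier.py, written against the TF 1.x tf.contrib.slim API; it corresponds roughly to the script's --checkpoint_exclude_scopes and --trainable_scopes flags. The checkpoint path, class count, and the choice to train only the logits scope are assumptions for illustration:

import tensorflow as tf
from nets import inception  # model definitions from tensorflow/models/research/slim

slim = tf.contrib.slim

num_classes = 200  # assumed; set to your dataset
images = tf.placeholder(tf.float32, [None, 299, 299, 3])  # InceptionV3 input size
labels = tf.placeholder(tf.int32, [None])

with slim.arg_scope(inception.inception_v3_arg_scope()):
    logits, _ = inception.inception_v3(images, num_classes=num_classes,
                                       is_training=True)

# restore all pre-trained weights except the (re-sized) classification head
variables_to_restore = slim.get_variables_to_restore(
    exclude=['InceptionV3/Logits', 'InceptionV3/AuxLogits'])
init_fn = slim.assign_from_checkpoint_fn('inception_v3.ckpt',  # assumed path
                                         variables_to_restore)

loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)

# train only the new head; drop var_list to fine-tune the whole network instead
variables_to_train = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES,
                                       scope='InceptionV3/Logits')
train_op = tf.train.GradientDescentOptimizer(0.0001).minimize(
    loss, var_list=variables_to_train)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    init_fn(sess)  # overwrite the random init with the pre-trained weights
    # ... feed image/label batches and repeatedly run train_op ...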

Another TensorFlow fine-tuning example: https://github.com/joelthchao/tensorflow-finetune-flickr-style
