python3 29. Saving and Restoring h5 Models in Keras (Study Notes)

Preface

     The Computer Vision series of study notes collects the code I have written while studying artificial intelligence (with a focus on computer vision). All code in this series is written in Python 3 and runs on the Anaconda platform; when using it, it is assumed that the relevant Python libraries are already installed, so that is not covered here. All code and materials for this series can be downloaded from my GitHub: https://github.com/mcyJacky/DeepLearning-CV. If you spot any problems, feel free to point them out.

1. Saving a Model as an h5 File

     Keras can save a trained model as an h5 file. The h5py library needs to be installed first: pip install h5py. Below we reuse the earlier MNIST classification example and save its training result; the usage is as follows:

import numpy as np
from keras.datasets import mnist
# numpy utilities provided by keras
from keras.utils import np_utils
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD

# load the data
(x_train,y_train), (x_test,y_test) = mnist.load_data()

# (60000, 28, 28)
print('x_shape:', x_train.shape)
# (60000,)
print('y_shape:', y_train.shape)

# reshape the data and normalize it
# (60000,28,28) -> (60000, 784)
x_train = x_train.reshape(x_train.shape[0], -1)/255.0
x_test = x_test.reshape(x_test.shape[0], -1)/255.0
# convert the labels to one-hot format
y_train = np_utils.to_categorical(y_train, num_classes=10)
y_test = np_utils.to_categorical(y_test, num_classes=10)

# build the model: 784 input neurons -> 10 output neurons (784-10)
model = Sequential([
        Dense(units=10, input_dim=784, bias_initializer='one', activation='softmax')
    ])

# define the optimizer
sgd = SGD(lr=0.2)

# set the optimizer, the loss function, and the accuracy metric tracked during training
model.compile(
    optimizer = sgd,
    loss = 'mse',
    metrics = ['accuracy']
)

# train the model
model.fit(x_train, y_train, batch_size=32, epochs=10)


# evaluate the model
loss, accuracy = model.evaluate(x_test, y_test)

print('\ntest loss:', loss)
print('accuracy:', accuracy)

# save the model
model.save('model.h5') # HDF5 file, requires pip install h5py

# printed output:
# x_shape: (60000, 28, 28)
# y_shape: (60000,)
# Epoch 1/10
# 60000/60000 [==============================] - 2s 36us/step - loss: 0.0373 - acc: 0.7794
# Epoch 2/10
# 60000/60000 [==============================] - 2s 32us/step - loss: 0.0203 - acc: 0.8809
# Epoch 3/10
# 60000/60000 [==============================] - 2s 32us/step - loss: 0.0177 - acc: 0.8932
# Epoch 4/10
# 60000/60000 [==============================] - 2s 33us/step - loss: 0.0165 - acc: 0.8996
# Epoch 5/10
# 60000/60000 [==============================] - 2s 38us/step - loss: 0.0156 - acc: 0.9036
# Epoch 6/10
# 60000/60000 [==============================] - 2s 35us/step - loss: 0.0151 - acc: 0.9065
# Epoch 7/10
# 60000/60000 [==============================] - 2s 34us/step - loss: 0.0146 - acc: 0.9089
# Epoch 8/10
# 60000/60000 [==============================] - 2s 34us/step - loss: 0.0143 - acc: 0.9106
# Epoch 9/10
# 60000/60000 [==============================] - 2s 34us/step - loss: 0.0140 - acc: 0.9124
# Epoch 10/10
# 60000/60000 [==============================] - 2s 36us/step - loss: 0.0138 - acc: 0.9137
# 10000/10000 [==============================] - 0s 18us/step

# test loss: 0.013036302373278887
# accuracy: 0.9181

     By running the program above, the trained model is saved to the file model.h5 with the save() method, and below we can restore it.
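     Before moving on, note that model.h5 is an ordinary HDF5 file, so it can also be opened directly with h5py to get a rough idea of what save() stored. The sketch below is for inspection only; the exact group and attribute names are an internal detail of Keras and may differ between versions:

import h5py

# open the checkpoint written by model.save() above
with h5py.File('model.h5', 'r') as f:
    # top-level groups, typically 'model_weights' and 'optimizer_weights'
    print(list(f.keys()))
    # the file also records which Keras version and backend wrote it
    print(f.attrs.get('keras_version'), f.attrs.get('backend'))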

2. Restoring an h5 Model and Continuing Training

     With the load_model() method we can restore the saved model and, if we want, continue training it. The details are as follows:

import numpy as np
from keras.datasets import mnist
from keras.utils import np_utils
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD
from keras.models import load_model

# load the data
(x_train,y_train),(x_test,y_test) = mnist.load_data()
# (60000,28,28)
print('x_shape:',x_train.shape)
# (60000,)
print('y_shape:',y_train.shape)
# (60000,28,28)->(60000,784)
x_train = x_train.reshape(x_train.shape[0],-1)/255.0
x_test = x_test.reshape(x_test.shape[0],-1)/255.0
# convert the labels to one-hot format
y_train = np_utils.to_categorical(y_train,num_classes=10)
y_test = np_utils.to_categorical(y_test,num_classes=10)

# load the saved model
model = load_model('model.h5')

# evaluate the model
loss, accuracy = model.evaluate(x_test, y_test)
print('\ntest loss', loss)
print('accuracy', accuracy)

# printed output:
# x_shape: (60000, 28, 28)
# y_shape: (60000,)
# 10000/10000 [==============================] - 0s 20us/step
#
# test loss 0.013036302373278887
# accuracy 0.9181


# continue training the model
model.fit(x_train, y_train, batch_size=64, epochs=2)

# evaluate the model again
loss, accuracy = model.evaluate(x_test, y_test)

print('\ntest loss', loss)
print('accuracy', accuracy)

# printed output:
# Epoch 1/2
# 60000/60000 [==============================] - 1s 23us/step - loss: 0.0136 - acc: 0.9151
# Epoch 2/2
# 60000/60000 [==============================] - 1s 21us/step - loss: 0.0135 - acc: 0.9156
# 10000/10000 [==============================] - 0s 16us/step

# test loss 0.012842106776637956
# accuracy 0.9187
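     Because save() stores the optimizer state together with the architecture and weights, the fit() call above genuinely resumes training instead of restarting the optimizer from scratch. After the extra epochs, the updated model can simply be written back out, for example:

# overwrite the checkpoint with the further-trained model
model.save('model.h5')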

3. Saving and Loading Only the Model Weights

     Of course, the model's weights alone can also be saved and then loaded for reuse, as follows:

# save the weights, load the weights
model.save_weights('my_model_weights.h5')
model.load_weights('my_model_weights.h5')
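     Note that load_weights() only restores parameter values, so it has to be called on a model whose architecture matches the saved weights, and the compile settings still have to be defined again. The sketch below is a minimal example under those assumptions, reusing the 784-10 network and the x_test/y_test data prepared earlier:

from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD

# rebuild exactly the same 784-10 architecture as in section 1
model2 = Sequential([
        Dense(units=10, input_dim=784, bias_initializer='one', activation='softmax')
    ])

# restore only the saved parameter values into the matching architecture
model2.load_weights('my_model_weights.h5')

# the weights file does not contain the compile configuration, so compile again
model2.compile(optimizer=SGD(lr=0.2), loss='mse', metrics=['accuracy'])

loss, accuracy = model2.evaluate(x_test, y_test)
print('\ntest loss:', loss)
print('accuracy:', accuracy)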

4. Saving and Loading Only the Network Architecture

     The network architecture alone can also be saved in a different format, such as JSON. A concrete example:

# save the network architecture, load the network architecture
from keras.models import model_from_json
json_string = model.to_json()
model = model_from_json(json_string)

print(json_string)
# printed output:
# {"config": {"name": "sequential_1", "layers": [{"config": {"kernel_initializer": {"config": {"mode": "fan_avg", "seed": null, "distribution": "uniform", "scale": 1.0}, "class_name": "VarianceScaling"}, "bias_regularizer": null, "dtype": "float32", "activation": "softmax", "units": 10, "use_bias": true, "bias_initializer": {"config": {}, "class_name": "Ones"}, "bias_constraint": null, "batch_input_shape": [null, 784], "activity_regularizer": null, "trainable": true, "kernel_regularizer": null, "name": "dense_1", "kernel_constraint": null}, "class_name": "Dense"}]}, "backend": "tensorflow", "class_name": "Sequential", "keras_version": "2.2.4"}



Reprint notice:
Copyright: free to reprint for non-commercial use, keep the attribution and cite the source.
Author: mcyJacky
Source: https://blog.csdn.net/mcyJacky
