Getting Started with Keras – MNIST Handwritten Digit Recognition
How to build a multilayer perceptron with Keras for handwritten digit recognition, covering the minimum you need to know to assemble a neural network.
import keras
Using TensorFlow backend.
dir(keras)
['Input',
'Model',
'Sequential',
'__builtins__',
'__cached__',
'__doc__',
'__file__',
'__loader__',
'__name__',
'__package__',
'__path__',
'__spec__',
'__version__',
'absolute_import',
'activations',
'applications',
'backend',
'callbacks',
'constraints',
'datasets',
'engine',
'initializers',
'layers',
'legacy',
'losses',
'metrics',
'models',
'optimizers',
'preprocessing',
'regularizers',
'utils',
'wrappers']
A brief introduction to the commonly used Keras modules
- `Input`, `Model`, `Sequential`: top-level shortcuts to the core model-building classes; the same classes are also exposed through the submodules below (e.g. `keras.models`)
- Names beginning with `__` are standard Python module attributes, not Keras modules
- `activations`: activation functions, such as sigmoid, relu, and softmax
- `applications`: ready-made, pre-trained Keras models, such as VGG for image recognition
- `backend`: the backend abstraction layer; Keras uses it to drive whichever backend actually executes the code (TensorFlow, Theano, etc.), and the models call into it automatically
- `callbacks`: callback classes that can, among other things, expose the network's internal state during training in advanced use cases
- `constraints`: constraints applied to network weights, which can help prevent overfitting
- `datasets`: loaders for many datasets commonly used with neural networks
- `engine`: the engine module, the core code behind `layers`; it implements the network topology, and the layer classes are built on top of it
- `initializers`: weight initialization methods
- `layers`: the network layers Keras has implemented, such as the fully connected `Dense` layer and the convolutional `Conv` layers
- `legacy`: legacy code kept for compatibility with older versions
- `losses`: objective functions (loss/cost functions), such as mean squared error and cross-entropy; they measure how well the network is doing, so you can follow the training progress across iterations
- `metrics`: evaluation metrics for judging network performance, such as accuracy and recall
- `models`: the model library. Keras has two kinds of models, the sequential model (`Sequential`) and the functional model (`Model`). The functional API is more general, and a sequential model is a special case of it: with a sequential model you stack layers one on top of another, like building blocks
- `optimizers`: optimizers, one of the required arguments when compiling a network; they define how the weights are updated during training
- `preprocessing`: preprocessing utilities for data, sequences, text, and images
- `regularizers`: regularization methods, used to prevent the network from overfitting during training
- `utils`: a collection of useful tools, e.g. for data conversion and normalization
- `wrappers`: layer wrappers, which wrap an ordinary layer to change how it is applied, e.g. applying a layer to every time step of a sequence
The basic Keras modules used in this post
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(8))
Dense(units, activation=None, use_bias=True, kernel_initializer='glorot_uniform', bias_initializer='zeros', kernel_regularizer=None, bias_regularizer=None, activity_regularizer=None, kernel_constraint=None, bias_constraint=None, **kwargs)
from keras import optimizers
Init signature: optimizers.SGD(lr=0.01, momentum=0.0, decay=0.0, nesterov=False, **kwargs)
optimizers.SGD(lr=0.01, momentum=0.0, decay=0.0, nesterov=False)
<keras.optimizers.SGD at 0x2e8b5198>
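The `SGD` parameters above map onto a simple update rule: with momentum `m` and learning rate `lr`, each step keeps a velocity `v = m*v - lr*grad` and adds it to the weights. A minimal NumPy-free sketch of that rule (an illustration, not Keras's actual implementation):

```python
def sgd_step(w, grad, v, lr=0.01, momentum=0.0):
    """One plain SGD-with-momentum update: v <- m*v - lr*grad, w <- w + v."""
    v = momentum * v - lr * grad
    return w + v, v

# Minimize f(w) = w^2 (gradient 2w), starting from w = 1.0.
w, v = 1.0, 0.0
for _ in range(500):
    w, v = sgd_step(w, 2 * w, v, lr=0.01, momentum=0.9)
# w converges toward the minimum at 0
```

With `momentum=0.0` this reduces to vanilla gradient descent, which is what `SGD()` with default arguments gives you.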
Handwritten digit recognition
import keras
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD
(x_train, y_train), (x_test, y_test) = mnist.load_data()
print(x_train.shape,y_train.shape)
print(x_test.shape,y_test.shape)
(60000, 28, 28) (60000,)
(10000, 28, 28) (10000,)
import matplotlib.pyplot as plt
im = plt.imshow(x_train[0],cmap='gray')
plt.show()
(shows the first training image in grayscale)
y_train[0]
5
x_train = x_train.reshape(60000,784)
x_test = x_test.reshape(10000,784)
print(x_train.shape)
print(x_test.shape)
(60000, 784)
(10000, 784)
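The reshape above simply flattens each 28×28 image into a 784-long row vector without changing any pixel values. A quick NumPy sketch of the same operation on dummy data:

```python
import numpy as np

# Dummy batch shaped like MNIST: 2 images of 28x28 pixels.
batch = np.arange(2 * 28 * 28, dtype=np.uint8).reshape(2, 28, 28)
flat = batch.reshape(2, 784)

# Pixel (r, c) of image i lands at column r*28 + c of row i.
assert flat[0, 5 * 28 + 3] == batch[0, 5, 3]
```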
x_train[0]
array([  0,   0,   0, ...,   0,   0,   0], dtype=uint8)
(784 uint8 values per image; most are 0, with intensities up to 255 in the digit region — full output truncated)
Normalize the data to the [0, 1] range
x_train = x_train / 255
x_test = x_test / 255
x_train[0]
array([0., 0., 0., ..., 0., 0., 0.])
(the same 784 values scaled into [0, 1]; the nonzero pixels now lie between 0 and 1 — full output truncated)
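Dividing by 255 maps the uint8 pixel range [0, 255] onto [0, 1]; note that NumPy implicitly promotes the integer array to float in the process. A small sketch:

```python
import numpy as np

pixels = np.array([0, 128, 255], dtype=np.uint8)
scaled = pixels / 255  # uint8 / int promotes to float64

# All values now lie in [0, 1]; 255 maps exactly to 1.0.
```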
Convert the y labels to one-hot vectors, e.g. 5 → [0, 0, 0, 0, 0, 1, 0, 0, 0, 0], using a function from the keras utils toolkit
y_train = keras.utils.to_categorical(y_train,10)
y_test = keras.utils.to_categorical(y_test,10)
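`to_categorical` performs ordinary one-hot encoding; the same result can be reproduced in plain NumPy, which makes it easy to see what the labels become:

```python
import numpy as np

def one_hot(labels, num_classes):
    """Plain-NumPy equivalent of keras.utils.to_categorical for 1-D integer labels."""
    out = np.zeros((len(labels), num_classes))
    out[np.arange(len(labels)), labels] = 1
    return out

encoded = one_hot(np.array([5, 0]), 10)
# encoded[0] is [0, 0, 0, 0, 0, 1, 0, 0, 0, 0] — the label 5 shown above
```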
Building the model
model = Sequential()
model.add(Dense(512,activation='relu',input_shape=(784,)))
model.add(Dense(256,activation='relu'))
model.add(Dense(10,activation='softmax'))
model.summary()
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense_2 (Dense) (None, 512) 401920
_________________________________________________________________
dense_3 (Dense) (None, 256) 131328
_________________________________________________________________
dense_4 (Dense) (None, 10) 2570
=================================================================
Total params: 535,818
Trainable params: 535,818
Non-trainable params: 0
_________________________________________________________________
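The parameter counts in the summary can be checked by hand: a Dense layer with n_in inputs and n_out units has n_in × n_out weights plus n_out biases:

```python
def dense_params(n_in, n_out):
    """Weights (n_in * n_out) plus one bias per unit."""
    return n_in * n_out + n_out

layer_sizes = [(784, 512), (512, 256), (256, 10)]
counts = [dense_params(i, o) for i, o in layer_sizes]
# counts == [401920, 131328, 2570]; total 535818, matching model.summary()
```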
Compiling the network
model.compile(optimizer=SGD(),loss='categorical_crossentropy',metrics=['accuracy'])
Training the model
model.fit(x_train,y_train,batch_size=64,epochs=20,validation_data=(x_test,y_test))
Train on 60000 samples, validate on 10000 samples
Epoch 1/20
60000/60000 [==============================] - 5s 88us/step - loss: 0.2047 - acc: 0.9422 - val_loss: 0.1933 - val_acc: 0.9443
Epoch 2/20
60000/60000 [==============================] - 5s 86us/step - loss: 0.1883 - acc: 0.9469 - val_loss: 0.1811 - val_acc: 0.9490
Epoch 3/20
60000/60000 [==============================] - 5s 85us/step - loss: 0.1740 - acc: 0.9511 - val_loss: 0.1678 - val_acc: 0.9513
Epoch 4/20
60000/60000 [==============================] - 5s 85us/step - loss: 0.1618 - acc: 0.9545 - val_loss: 0.1594 - val_acc: 0.9533
Epoch 5/20
60000/60000 [==============================] - 5s 86us/step - loss: 0.1511 - acc: 0.9575 - val_loss: 0.1508 - val_acc: 0.9556
Epoch 6/20
60000/60000 [==============================] - 5s 86us/step - loss: 0.1416 - acc: 0.9602 - val_loss: 0.1421 - val_acc: 0.9579
Epoch 7/20
60000/60000 [==============================] - 5s 86us/step - loss: 0.1334 - acc: 0.9629 - val_loss: 0.1345 - val_acc: 0.9598
Epoch 8/20
60000/60000 [==============================] - 5s 87us/step - loss: 0.1255 - acc: 0.9650 - val_loss: 0.1295 - val_acc: 0.9608
Epoch 9/20
60000/60000 [==============================] - 5s 87us/step - loss: 0.1189 - acc: 0.9667 - val_loss: 0.1228 - val_acc: 0.9628
Epoch 10/20
60000/60000 [==============================] - 5s 88us/step - loss: 0.1124 - acc: 0.9687 - val_loss: 0.1182 - val_acc: 0.9647
Epoch 11/20
60000/60000 [==============================] - 5s 90us/step - loss: 0.1065 - acc: 0.9701 - val_loss: 0.1149 - val_acc: 0.9667
Epoch 12/20
60000/60000 [==============================] - 5s 88us/step - loss: 0.1012 - acc: 0.9722 - val_loss: 0.1103 - val_acc: 0.9671
Epoch 13/20
60000/60000 [==============================] - 5s 89us/step - loss: 0.0965 - acc: 0.9729 - val_loss: 0.1076 - val_acc: 0.9687
Epoch 14/20
60000/60000 [==============================] - 5s 89us/step - loss: 0.0918 - acc: 0.9751 - val_loss: 0.1025 - val_acc: 0.9692
Epoch 15/20
60000/60000 [==============================] - 5s 89us/step - loss: 0.0878 - acc: 0.9758 - val_loss: 0.1013 - val_acc: 0.9704
Epoch 16/20
60000/60000 [==============================] - 5s 90us/step - loss: 0.0837 - acc: 0.9772 - val_loss: 0.1009 - val_acc: 0.9702
Epoch 17/20
60000/60000 [==============================] - 5s 90us/step - loss: 0.0801 - acc: 0.9783 - val_loss: 0.0949 - val_acc: 0.9719
Epoch 18/20
60000/60000 [==============================] - 5s 90us/step - loss: 0.0764 - acc: 0.9791 - val_loss: 0.0928 - val_acc: 0.9725
Epoch 19/20
60000/60000 [==============================] - 5s 91us/step - loss: 0.0733 - acc: 0.9799 - val_loss: 0.0922 - val_acc: 0.9715
Epoch 20/20
60000/60000 [==============================] - 6s 93us/step - loss: 0.0704 - acc: 0.9809 - val_loss: 0.0907 - val_acc: 0.9729
<keras.callbacks.History at 0x3265e860>
Evaluating and saving the model
score = model.evaluate(x_test,y_test)
model.save('mnist_model.h5')
print("loss:",score[0])
print("accu:",score[1])
10000/10000 [==============================] - 0s 42us/step
loss: 0.09071610405351967
accu: 0.9729