A Keras Appetizer: MNIST Classification

MNIST is the "Hello World" of machine learning: it gives us a very direct taste of what machine learning can do.

This post shows how to build a simple CNN in Keras to classify the MNIST dataset. The model is simple, but the accuracy is nothing to sneeze at: 99.2%.

# coding: utf-8

'''
Classify MNIST with a CNN

Note: in this post the Keras backend is TensorFlow,
and the Keras version is 2.1.6
'''

import numpy as np
np.random.seed(1337)  # fix the random seed so the results in this post are reproducible

# Import the necessary packages
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Conv2D, MaxPool2D, Dense, Dropout, Activation, Flatten
from keras.utils import np_utils


# Load the MNIST dataset and reshape the images to (28, 28, 1) for the Conv2D layers
(X_train, y_train), (X_test, y_test) = mnist.load_data()
X_train = X_train.reshape(-1, 28, 28, 1).astype(np.float32)
X_test = X_test.reshape(-1, 28, 28, 1).astype(np.float32)
Y_train = np_utils.to_categorical(y_train, 10)  # one-hot encode the labels
Y_test = np_utils.to_categorical(y_test, 10)  # one-hot encode the labels

# Preprocessing: scale pixel values from [0, 255] to [0, 1]
X_train /= 255.
X_test /= 255.
print(X_train.shape[0], 'train samples')
print(X_test.shape[0], 'test samples')
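# At this point X_train has shape (60000, 28, 28, 1) and X_test has shape (10000, 28, 28, 1),
# while Y_train and Y_test are one-hot matrices of shape (60000, 10) and (10000, 10);
# for example, the label 7 becomes the vector [0, 0, 0, 0, 0, 0, 0, 1, 0, 0].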


# Build the CNN model
model = Sequential()
model.add(Conv2D(32, 3, input_shape=(28, 28, 1)))  # 32 filters with a 3x3 kernel
model.add(Activation('relu'))
model.add(Conv2D(32, 3))
model.add(Activation('relu'))
model.add(MaxPool2D(pool_size=(2, 2)))  # downsample each feature map by 2
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(128))
model.add(Activation('relu'))
model.add(Dropout(0.5))
model.add(Dense(10))  # one output per digit class
model.add(Activation('softmax'))
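# With the default 'valid' padding, the feature maps are 26x26x32 after the first
# convolution, 24x24x32 after the second, and 12x12x32 after max pooling, so Flatten
# produces a 4608-dimensional vector feeding the two fully connected layers.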

# Configure training: loss function, optimizer, and evaluation metric
model.compile(loss='categorical_crossentropy',
              optimizer='adadelta',
              metrics=['accuracy'])

# Train the model
model.fit(X_train, Y_train, batch_size=256, epochs=12,
          verbose=1, validation_data=(X_test, Y_test))

# Evaluate the model on the test set
score = model.evaluate(X_test, Y_test, verbose=0)
print('Test loss:', score[0])
print('Test accuracy:', score[1])
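Once training and evaluation are done, it is also easy to spot-check a few individual predictions at the end of the script; a minimal sketch using model.predict, where np.argmax recovers the class index from the 10-way softmax output:

# Predict the first five test digits and compare with the true labels
pred = model.predict(X_test[:5])
print('Predicted:', np.argmax(pred, axis=1))
print('Actual:   ', y_test[:5])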

Running the script produces the following output:

Using TensorFlow backend.
60000 train samples
10000 test samples

Train on 60000 samples, validate on 10000 samples

Epoch 1/12
60000/60000 [====================] - 6s - loss: 0.3998 - acc: 0.8762 - val_loss: 0.0883 - val_acc: 0.9733
Epoch 2/12
60000/60000 [====================] - 3s - loss: 0.1240 - acc: 0.9634 - val_loss: 0.0610 - val_acc: 0.9808
Epoch 3/12
60000/60000 [====================] - 4s - loss: 0.0900 - acc: 0.9734 - val_loss: 0.0436 - val_acc: 0.9853
Epoch 4/12
60000/60000 [====================] - 4s - loss: 0.0736 - acc: 0.9783 - val_loss: 0.0370 - val_acc: 0.9879
Epoch 5/12
60000/60000 [====================] - 4s - loss: 0.0633 - acc: 0.9813 - val_loss: 0.0327 - val_acc: 0.9889
Epoch 6/12
60000/60000 [====================] - 4s - loss: 0.0583 - acc: 0.9824 - val_loss: 0.0338 - val_acc: 0.9884
Epoch 7/12
60000/60000 [====================] - 4s - loss: 0.0510 - acc: 0.9848 - val_loss: 0.0330 - val_acc: 0.9886
Epoch 8/12
60000/60000 [====================] - 4s - loss: 0.0473 - acc: 0.9860 - val_loss: 0.0314 - val_acc: 0.9902
Epoch 9/12
60000/60000 [====================] - 3s - loss: 0.0428 - acc: 0.9867 - val_loss: 0.0315 - val_acc: 0.9905
Epoch 10/12
60000/60000 [====================] - 4s - loss: 0.0399 - acc: 0.9876 - val_loss: 0.0267 - val_acc: 0.9904
Epoch 11/12
60000/60000 [====================] - 4s - loss: 0.0372 - acc: 0.9884 - val_loss: 0.0262 - val_acc: 0.9919
Epoch 12/12
60000/60000 [====================] - 4s - loss: 0.0361 - acc: 0.9886 - val_loss: 0.0270 - val_acc: 0.9920
Test loss: 0.02704390722082826
Test accuracy: 0.992
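
To avoid retraining every time, the fitted model can be written to disk and reloaded later. A minimal sketch using model.save and load_model (the file name mnist_cnn.h5 is just an example, and saving to HDF5 requires the h5py package):

# Save the architecture, weights, and optimizer state to a single HDF5 file
model.save('mnist_cnn.h5')

# Restore the model later without rebuilding or recompiling it
from keras.models import load_model
restored = load_model('mnist_cnn.h5')
print(restored.evaluate(X_test, Y_test, verbose=0))  # should reproduce the test metrics above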