TensorFlow Tutorial (谭秉峰) (6): Handwritten Digit Recognition on the MNIST Dataset

1. Loading the downloaded MNIST dataset

# Author: Zubin
# -*- coding: utf-8 -*-
from tensorflow.examples.tutorials.mnist import input_data

#Load the MNIST dataset; if no data is found at the given path, it is downloaded automatically
mnist = input_data.read_data_sets("D:/Pycharm/MNIST_DEV/MNIST_data/",one_hot=True)
# read_data_sets automatically splits the MNIST data into train, validation, and test sets
# 2. The following code prints the size of each of the three subsets.
print("Training data size: ", mnist.train.num_examples)
print("Validating data size: ", mnist.validation.num_examples)
print("Testing data size: ", mnist.test.num_examples)

# 3. Inspect the flattened pixel array of one training example and its one-hot label.
print("Example training data: ", mnist.train.images[0])
print("Example training data label: ", mnist.train.labels[0])

# 4. Use mnist.train.next_batch to implement stochastic gradient descent with mini-batches.
batch_size = 100
xs, ys = mnist.train.next_batch(batch_size)    # draw batch_size training examples from the training set
print("X shape:", xs.shape)
print("Y shape:", ys.shape)
Training data size:  55000
Validating data size:  5000
Testing data size:  10000
Example training data:  [0.         0.         0.         0.         0.         0.
 ...   (784 normalized pixel intensities in [0, 1]; the vector is mostly
        zeros, with nonzero values along the strokes of the digit)   ...
 0.         0.         0.         0.        ]
Example training data label:  [0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]
X shape: (100, 784)
Y shape: (100, 10)
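
The 784-entry vector above is just the 28x28 image flattened row by row, with pixel intensities normalized to [0, 1], and the one-hot label [0. 0. 0. 0. 0. 0. 0. 1. 0. 0.] encodes the digit 7. As a quick sanity check, here is a minimal sketch (assuming matplotlib is installed; it is not required by the rest of the tutorial) that reshapes the vector back into an image and decodes the label:

# Sketch: visualize one MNIST training example (assumes matplotlib is available)
import numpy as np
import matplotlib.pyplot as plt
from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets("D:/Pycharm/MNIST_DEV/MNIST_data/", one_hot=True)

image = mnist.train.images[0].reshape(28, 28)   # unflatten 784 -> 28x28 pixel grid
label = np.argmax(mnist.train.labels[0])        # one-hot vector -> digit index (7 here)

plt.imshow(image, cmap="gray")
plt.title("label: %d" % label)
plt.show()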

2. Training: a neural network with a single hidden layer

# Author: Zubin
# -*- coding: utf-8 -*-
from MNIST_defiction import *   # helper module from part 1 that loads the mnist dataset
import tensorflow as tf

#size of each mini-batch
batch_size=100
#number of batches per epoch: 55000 // 100 = 550
n_batch=mnist.train.num_examples//batch_size

#MNIST dataset constants
INPUT_NODE=784
#number of neurons in the hidden layer
L1_NODE=500
OUTPUT_NODE=10



'''
Define two placeholders: each 28x28 grayscale image is flattened into a
784-dimensional vector (28*28 = 784), and each label is a 10-dimensional
one-hot vector. The network has a single hidden layer with L1_NODE (500) neurons.
'''

x=tf.placeholder(tf.float32,[None,784])
y=tf.placeholder(tf.float32,[None,10])

#Build a simple feed-forward network; the weights are initialized from a
#truncated normal distribution (stddev 0.1) and the biases to the constant 0.1
W1=tf.Variable(tf.truncated_normal([INPUT_NODE,L1_NODE],stddev=0.1))
b1=tf.Variable(tf.constant(0.1,shape=[L1_NODE]))
W2=tf.Variable(tf.truncated_normal([L1_NODE,OUTPUT_NODE],stddev=0.1))
b2=tf.Variable(tf.constant(0.1,shape=[OUTPUT_NODE]))
#hidden-layer activations (note: the author uses softmax as the hidden activation)
L1_output=tf.nn.softmax(tf.matmul(x,W1)+b1)
prediction=tf.nn.softmax(tf.matmul(L1_output,W2)+b2)

#quadratic (mean-squared-error) cost function
loss=tf.reduce_mean(tf.square(y-prediction))
#use the Adam adaptive optimizer
train_step=tf.train.AdamOptimizer(0.01).minimize(loss)

#initialize the variables
init=tf.global_variables_initializer()

#correct_prediction is a list of booleans, one per example
correct_prediction=tf.equal(tf.argmax(y,1),tf.argmax(prediction,1))
#accuracy: cast the booleans to floats and take the mean
accuracy=tf.reduce_mean(tf.cast(correct_prediction,tf.float32))

with tf.Session() as sess:
    sess.run(init)
    #iterate over the full training set 51 times (epochs)
    for epoch in range(51):
        #train on every mini-batch in the epoch
        for batch in range(n_batch):
            #batch_xs holds the images, batch_ys the labels
            batch_xs,batch_ys=mnist.train.next_batch(batch_size)
            sess.run(train_step,feed_dict={x:batch_xs,y:batch_ys})

        #after each epoch, evaluate accuracy on the test set
        acc=sess.run(accuracy,feed_dict={x:mnist.test.images,y:mnist.test.labels})
        print("After "+str(epoch)+" training steps,"+"accuracy is "+str(acc))

Output:

After 0 training steps,accuracy is 0.9286
After 1 training steps,accuracy is 0.9338
After 2 training steps,accuracy is 0.9356
After 3 training steps,accuracy is 0.9333
After 4 training steps,accuracy is 0.9406
After 5 training steps,accuracy is 0.942
After 6 training steps,accuracy is 0.9428
After 7 training steps,accuracy is 0.9403
After 8 training steps,accuracy is 0.9399
After 9 training steps,accuracy is 0.9463
After 10 training steps,accuracy is 0.9458
After 11 training steps,accuracy is 0.9469
After 12 training steps,accuracy is 0.9486
After 13 training steps,accuracy is 0.9467
After 14 training steps,accuracy is 0.9467
After 15 training steps,accuracy is 0.9489
After 16 training steps,accuracy is 0.9492
After 17 training steps,accuracy is 0.9478
After 18 training steps,accuracy is 0.9507
After 19 training steps,accuracy is 0.9484
After 20 training steps,accuracy is 0.9532
After 21 training steps,accuracy is 0.9489
After 22 training steps,accuracy is 0.9524
After 23 training steps,accuracy is 0.9527
After 24 training steps,accuracy is 0.953
After 25 training steps,accuracy is 0.9524
After 26 training steps,accuracy is 0.9521
After 27 training steps,accuracy is 0.9546
After 28 training steps,accuracy is 0.9497
After 29 training steps,accuracy is 0.9513
After 30 training steps,accuracy is 0.9544
After 31 training steps,accuracy is 0.9528
After 32 training steps,accuracy is 0.9531
After 33 training steps,accuracy is 0.9515
After 34 training steps,accuracy is 0.9509
After 35 training steps,accuracy is 0.9487
After 36 training steps,accuracy is 0.9512
After 37 training steps,accuracy is 0.9533
After 38 training steps,accuracy is 0.9539
After 39 training steps,accuracy is 0.9509
After 40 training steps,accuracy is 0.9527
After 41 training steps,accuracy is 0.9504
After 42 training steps,accuracy is 0.9554
After 43 training steps,accuracy is 0.9537
After 44 training steps,accuracy is 0.9516
After 45 training steps,accuracy is 0.9502
After 46 training steps,accuracy is 0.954
After 47 training steps,accuracy is 0.9543
After 48 training steps,accuracy is 0.9557
After 49 training steps,accuracy is 0.9541
After 50 training steps,accuracy is 0.9557

Process finished with exit code 0
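
Since the output layer already applies a softmax, pairing it with a cross-entropy loss is the more common choice than the quadratic cost used above, and it typically converges faster. The sketch below shows what that swap would look like; it is an alternative to the author's loss, not a reproduction of it. Note that tf.nn.softmax_cross_entropy_with_logits expects raw logits, so the explicit softmax on the output layer must be dropped:

# Sketch: swapping the quadratic cost for softmax cross-entropy.
# Same graph as above, except the output layer now produces raw logits;
# the loss op applies the softmax internally.
logits = tf.matmul(L1_output, W2) + b2
loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=logits))
train_step = tf.train.AdamOptimizer(0.01).minimize(loss)

# softmax is monotonic, so taking argmax over the raw logits yields the
# same predicted classes as taking it over the softmax probabilities
correct_prediction = tf.equal(tf.argmax(y, 1), tf.argmax(logits, 1))
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))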
