Baidu_AI_Studio_Learning_2

A first example of deep learning and neural networks

Based on the Baidu AI Studio tutorial:
https://aistudio.baidu.com/aistudio/index

Getting started
Structure: prepare the data, configure the network, train and evaluate the model, make predictions.
Key packages: numpy, paddle, paddle.fluid, PIL (Image), matplotlib.pyplot, os.

The handwritten digit recognition dataset (MNIST)
MNIST: 28×28-pixel grayscale images of the digits 0~9; it is a very old dataset.
The MNIST dataset contains 60,000 training samples and 10,000 test samples.
Each sample consists of an image and a label: the image is a 28×28 pixel matrix, and the label is one of the 10 digits 0~9.
Running this example can be seen as the "hello world" of deep learning.

Step 1: Prepare the data

# Import the required packages
import numpy as np
import paddle
import paddle.fluid as fluid
from PIL import Image
import matplotlib.pyplot as plt
import os

# Shuffle the training set with a 512-sample buffer, then group it into batches of 128
train_reader = paddle.batch(paddle.reader.shuffle(paddle.dataset.mnist.train(),
                                                  buf_size=512),
                            batch_size=128)
test_reader = paddle.batch(paddle.dataset.mnist.test(),
                           batch_size=128)
[==================================================]t/train-images-idx3-ubyte.gz not found, downloading https://dataset.bj.bcebos.com/mnist/train-images-idx3-ubyte.gz
[==================================================]t/train-labels-idx1-ubyte.gz not found, downloading https://dataset.bj.bcebos.com/mnist/train-labels-idx1-ubyte.gz
[==================================================]t/t10k-images-idx3-ubyte.gz not found, downloading https://dataset.bj.bcebos.com/mnist/t10k-images-idx3-ubyte.gz
[==================================================]t/t10k-labels-idx1-ubyte.gz not found, downloading https://dataset.bj.bcebos.com/mnist/t10k-labels-idx1-ubyte.gz
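An aside on how these readers work: in paddle.fluid 1.x, a reader is simply a function that returns a generator of samples, paddle.reader.shuffle buffers buf_size samples and shuffles them, and paddle.batch groups samples into lists. A toy re-implementation of the batching idea (plain Python for illustration only; the names are mine, not Paddle's):

def toy_reader():                        # stands in for paddle.dataset.mnist.train()
    def reader():
        for i in range(10):              # pretend each i is an (image, label) sample
            yield (i, i % 2)
    return reader

def toy_batch(reader, batch_size):       # stands in for paddle.batch
    def batch_reader():
        batch = []
        for sample in reader():
            batch.append(sample)
            if len(batch) == batch_size:
                yield batch
                batch = []
        if batch:                        # the last, possibly smaller, batch
            yield batch
    return batch_reader

for b in toy_batch(toy_reader(), 4)():
    print(b)                             # [(0, 0), (1, 1), (2, 0), (3, 1)], ...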

Print and inspect the MNIST dataset

This produces a very long dump, so feel free to skim past it. Each sample is a tuple of a flattened 784-value float32 array, normalized to [-1, 1], and its integer label.


# Fetch a single sample to see what the data looks like
temp_reader = paddle.batch(paddle.dataset.mnist.train(),
                           batch_size=1)
temp_data = next(temp_reader())
print(temp_data)


[(array([-1.        , -1.        , -1.        , -1.        , -1.        ,
       -1.        , -1.        , -1.        , -1.        , -1.        ,
       ... (the middle of this 784-value dump is omitted here; the values
       are the 28*28 pixels normalized to [-1, 1]) ...
       -1.        , -1.        , -1.        , -1.        ], dtype=float32), 5)]

Step 2: Configure the network
Define a simple multilayer perceptron with three layers:
two hidden layers of size 100 and an output layer of size 10, one unit per digit class (0~9).
The output layer's activation function is set to Softmax, which makes it act as a classifier.
Together with the input layer, the structure is:

input -> hidden -> hidden -> output

# Define the multilayer perceptron
def multilayer_perceptron(input):
    # First fully connected layer, ReLU activation
    hidden1 = fluid.layers.fc(input=input, size=100, act='relu')
    # Second fully connected layer, ReLU activation
    hidden2 = fluid.layers.fc(input=hidden1, size=100, act='relu')
    # Fully connected output layer of size 10 with softmax activation
    prediction = fluid.layers.fc(input=hidden2, size=10, act='softmax')
    return prediction
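To make the model's size concrete, here is a quick count of the trainable parameters, assuming each fc layer carries a weight matrix plus a bias vector (fluid's default):

# Rough parameter count for this MLP with the flattened 28*28 = 784-dim input
n_in, h1, h2, n_out = 28 * 28, 100, 100, 10
params = (n_in * h1 + h1) + (h1 * h2 + h2) + (h2 * n_out + n_out)
print(params)  # 89610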

Define the input layer
Because MNIST images are 28×28 grayscale, the input shape is [1, 28, 28] (one channel).
For 32×32 RGB images it would be [3, 32, 32].
Note that the shape passed to fluid.layers.data omits the batch dimension, which fluid adds automatically.

image = fluid.layers.data(name='image', shape=[1, 28, 28], dtype='float32')  # single channel, 28*28 pixel values
label = fluid.layers.data(name='label', shape=[1], dtype='int64')            # image label

Define the loss function
Cross-entropy loss is commonly used for classification tasks. We also take its mean,
because the loss is defined over a whole batch.
At the same time we define an accuracy function, so accuracy can be reported.

# Get the classifier, then the loss and accuracy functions
model = multilayer_perceptron(image)  # build the classifier defined above
cost = fluid.layers.cross_entropy(input=model, label=label)
# cross_entropy measures the gap between the true labels and the predicted probabilities
avg_cost = fluid.layers.mean(cost)  # mean() averages the loss over the batch (like MATLAB's mean)
acc = fluid.layers.accuracy(input=model, label=label)  # accuracy function
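To see what cross_entropy and mean actually compute, a minimal numpy sketch (an illustration of the math, not fluid's implementation): cross-entropy is the negative log of the probability assigned to each sample's true label, and the mean averages it over the batch.

import numpy as np
probs = np.array([[0.7, 0.2, 0.1],   # predicted probabilities (batch of 2, 3 classes)
                  [0.1, 0.8, 0.1]])
labels = np.array([0, 1])            # true class indices
per_sample = -np.log(probs[np.arange(len(labels)), labels])
print(per_sample.mean())             # ~0.2899: mean of -log(0.7) and -log(0.8)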

Define the optimization method
Use the Adam optimizer with a learning rate of 0.001.

optimizer = fluid.optimizer.AdamOptimizer(learning_rate=0.001)  # optimize with the Adam algorithm
opts = optimizer.minimize(avg_cost)  # minimize the loss function
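For reference, a minimal numpy sketch of a single Adam update step, assuming the standard Adam recurrences with the usual defaults (beta1=0.9, beta2=0.999, eps=1e-8); the helper name adam_step is mine:

import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * grad       # running mean of the gradients
    v = b2 * v + (1 - b2) * grad**2    # running mean of the squared gradients
    m_hat = m / (1 - b1**t)            # bias correction for step t
    v_hat = v / (1 - b2**t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

w, m, v = np.array([1.0]), np.zeros(1), np.zeros(1)
w, m, v = adam_step(w, np.array([0.5]), m, v, t=1)
print(w)  # ~0.999: the first step moves by roughly lr in the gradient direction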

Step 3: Train the model && Step 4: Evaluate the model
Define an executor and initialize the parameters

# Define an executor that runs on the CPU
place = fluid.CPUPlace()
exe = fluid.Executor(place)  # Executor is what actually runs the program
# The two lines above mean roughly: define a place (device) on the CPU,
# and make that place the device the executor runs on.
# {This explanation may not be very rigorous, but it is how I understand it
# as a beginner; if there is a better way to put it, please advise.}
# Initialize the parameters
exe.run(fluid.default_startup_program())

There are two inputs:
the image data and its label.
In this example the label is an integer from 0 to 9.

# Define how input data is fed: the feeder maps each (image, label) sample onto the two input layers
feeder = fluid.DataFeeder(place=place, feed_list=[image, label])

Start training
Evaluate with the test set after each pass

# Start training and testing
for pass_id in range(5):
    # Training
    for batch_id, data in enumerate(train_reader()):  # iterate over train_reader (batches of data)
        train_cost, train_acc = exe.run(program=fluid.default_main_program(),  # run the main program
                                        feed=feeder.feed(data),             # feed data to the model
                                        fetch_list=[avg_cost, acc])         # fetch the loss and accuracy
        # Print the loss and accuracy every 100 batches
        if batch_id % 100 == 0:
            print('Pass:%d, Batch:%d, Cost:%0.5f, Accuracy:%0.5f' %
                  (pass_id, batch_id, train_cost[0], train_acc[0]))

    # Testing
    test_accs = []
    test_costs = []
    # Run a test pass after every training pass
    for batch_id, data in enumerate(test_reader()):                         # iterate over test_reader
        test_cost, test_acc = exe.run(program=fluid.default_main_program(), # run the same main program on test data
                                      feed=feeder.feed(data),               # feed data
                                      fetch_list=[avg_cost, acc])           # fetch the loss and accuracy
        test_accs.append(test_acc[0])                                       # accuracy of each batch
        test_costs.append(test_cost[0])                                     # loss of each batch
    # Average the test results
    test_cost = (sum(test_costs) / len(test_costs))                         # average loss for this pass
    test_acc = (sum(test_accs) / len(test_accs))                            # average accuracy for this pass
    print('Test:%d, Cost:%0.5f, Accuracy:%0.5f' % (pass_id, test_cost, test_acc))

    # Save the model
    model_save_dir = "/home/aistudio/data/hand.inference.model"
    # Create the save directory if it does not exist
    if not os.path.exists(model_save_dir):
        os.makedirs(model_save_dir)
    print('save models to %s' % (model_save_dir))
    fluid.io.save_inference_model(model_save_dir,  # path where the inference model is saved
                                  ['image'],       # names of the variables that inference needs fed
                                  [model],         # Variables holding the inference results
                                  exe)             # the executor that saves the inference model
Pass:0, Batch:0, Cost:2.70130, Accuracy:0.05469
Pass:0, Batch:100, Cost:0.44905, Accuracy:0.84375
Pass:0, Batch:200, Cost:0.20944, Accuracy:0.93750
Pass:0, Batch:300, Cost:0.37832, Accuracy:0.85938
Pass:0, Batch:400, Cost:0.21634, Accuracy:0.93750
Test:0, Cost:0.22907, Accuracy:0.92880
save models to /home/aistudio/data/hand.inference.model
Pass:1, Batch:0, Cost:0.30485, Accuracy:0.91406
Pass:1, Batch:100, Cost:0.20843, Accuracy:0.95312
Pass:1, Batch:200, Cost:0.12292, Accuracy:0.96875
Pass:1, Batch:300, Cost:0.12543, Accuracy:0.95312
Pass:1, Batch:400, Cost:0.08486, Accuracy:0.97656
Test:1, Cost:0.15316, Accuracy:0.95095
save models to /home/aistudio/data/hand.inference.model
Pass:2, Batch:0, Cost:0.21079, Accuracy:0.92969
Pass:2, Batch:100, Cost:0.12976, Accuracy:0.95312
Pass:2, Batch:200, Cost:0.08817, Accuracy:0.97656
Pass:2, Batch:300, Cost:0.20444, Accuracy:0.94531
Pass:2, Batch:400, Cost:0.11258, Accuracy:0.95312
Test:2, Cost:0.11705, Accuracy:0.96222
save models to /home/aistudio/data/hand.inference.model
Pass:3, Batch:0, Cost:0.18898, Accuracy:0.95312
Pass:3, Batch:100, Cost:0.14870, Accuracy:0.94531
Pass:3, Batch:200, Cost:0.06573, Accuracy:0.97656
Pass:3, Batch:300, Cost:0.11360, Accuracy:0.97656
Pass:3, Batch:400, Cost:0.04338, Accuracy:0.98438
Test:3, Cost:0.09820, Accuracy:0.96786
save models to /home/aistudio/data/hand.inference.model
Pass:4, Batch:0, Cost:0.11982, Accuracy:0.96875
Pass:4, Batch:100, Cost:0.11513, Accuracy:0.97656
Pass:4, Batch:200, Cost:0.06515, Accuracy:0.99219
Pass:4, Batch:300, Cost:0.16725, Accuracy:0.96094
Pass:4, Batch:400, Cost:0.09474, Accuracy:0.98438
Test:4, Cost:0.08979, Accuracy:0.97083
save models to /home/aistudio/data/hand.inference.model

Step 5: Make predictions
A few important concepts: grayscale conversion, one-dimensional vectors, normalization.
Before predicting, the image must be preprocessed in exactly the same way as during training: first convert it to grayscale, then resize it to 28*28, then turn the image into a one-dimensional vector, and finally normalize that vector. The normalization x/255*2-1 maps pixel values from [0, 255] to [-1, 1] (0 maps to -1, 255 maps to 1), matching the training reader.

# Preprocess the image
def load_image(file):
    im = Image.open(file).convert('L')                          # convert RGB to grayscale; 'L' means grayscale, with pixel values in 0~255
    im = im.resize((28, 28), Image.ANTIALIAS)                   # resize to 28*28 with high-quality (antialiased) resampling
    im = np.array(im).reshape(1, 1, 28, 28).astype(np.float32)  # reshape into a numpy array matching the feed format
    # print(im)
    im = im / 255.0 * 2.0 - 1.0                                 # normalize from [0, 255] to [-1, 1]
    print(im)
    return im

img = Image.open('data/data27012/6.png')
plt.imshow(img)   # draw the image from the array
plt.show()        # display it

(Figure: the handwritten digit image 6.png to be predicted)

infer_exe = fluid.Executor(place)     # executor used for inference
inference_scope = fluid.core.Scope()  # a separate scope to hold the loaded inference model

(I find the following part a little hard to follow.)
Finally, convert the image into a one-dimensional vector and run the prediction; the data is passed in through the image entry of the feed. fetch_list is the network model's final classifier layer, so the output is a probability for each of the 10 labels, and these probabilities sum to 1.
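Why the ten probabilities sum to 1: the softmax activation on the output layer exponentiates the layer's raw scores and renormalizes them. A minimal numpy sketch:

import numpy as np
scores = np.array([1.0, 2.0, 3.0])             # toy pre-softmax scores
probs = np.exp(scores) / np.exp(scores).sum()
print(probs, probs.sum())                      # [0.090 0.245 0.665] 1.0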

# Load the model and start predicting
with fluid.scope_guard(inference_scope):
    # Load the trained inference model from the directory it was saved to
    [inference_program,                                             # the inference Program
     feed_target_names,                                             # a list of str: names of the variables the inference Program needs fed
     fetch_targets] = fluid.io.load_inference_model(model_save_dir, # fetch_targets: a list of Variables holding the inference results; model_save_dir: path where the model was saved
                                                    infer_exe)      # infer_exe: the executor that runs the inference model
    img = load_image('data/data27012/6.png')

    results = infer_exe.run(program=inference_program,         # run the inference program
                            feed={feed_target_names[0]: img},  # feed the image to predict
                            fetch_list=fetch_targets)          # fetch the inference results

The long dump below is the normalized 28×28 image array printed by the print(im) inside load_image (the label probabilities themselves are returned in results):

[[[[-0.9843137  -0.9843137  -0.9843137  -0.9843137  -0.9843137
    -0.9843137  -0.9843137  -0.9843137  -0.9843137  -0.9843137
    ... (the remaining rows of this 1x1x28x28 array are omitted here;
    every value lies in [-1, 1]) ...
    -0.99215686 -0.99215686 -0.99215686]]]]

Once we have the probability of each label, we take the label with the highest probability and print it.

# Get the label with the highest probability
lab = np.argsort(results)                               # argsort returns the indices that sort the results from smallest to largest
# print(lab)
print("The predicted label for this image is: %d" % lab[0][0][-1])  # -1 takes the last element, i.e. the index of the largest probability

The predicted label for this image is: 6
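A small aside, not from the original tutorial: since only the most probable class is needed, np.argmax gives the same answer more directly than argsort.

lab = np.argmax(results[0])  # results[0] has shape (1, 10)
print("The predicted label for this image is: %d" % lab)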

Summary:
Baidu_AI_Studio_Learning_1 and 2 gave a brief introduction to deep learning and walked through one worked example.
I actually think this model could be optimized further,
though at this scale of computation it would not make much difference.

The key is to grasp the overall structure:
prepare the data, configure the network, train and evaluate the model, run prediction experiments,
and verify it all in practice on the PaddlePaddle platform.
