[Deep Learning in Practice] P7: Coffee Bean Classification with VGG-16

Code

import warnings
import matplotlib.pyplot as plt

warnings.filterwarnings("ignore")               # suppress warning messages
plt.rcParams['font.sans-serif']    = ['SimHei'] # render Chinese labels correctly
plt.rcParams['axes.unicode_minus'] = False      # render minus signs correctly
plt.rcParams['figure.dpi']         = 100        # figure resolution

epochs_range = range(epochs)

plt.figure(figsize=(12, 3))
plt.subplot(1, 2, 1)

plt.plot(epochs_range, train_acc, label='Training Accuracy')
plt.plot(epochs_range, test_acc, label='Test Accuracy')
plt.legend(loc='lower right')
plt.title('Training and Test Accuracy')

plt.subplot(1, 2, 2)
plt.plot(epochs_range, train_loss, label='Training Loss')
plt.plot(epochs_range, test_loss, label='Test Loss')
plt.legend(loc='upper right')
plt.title('Training and Test Loss')
plt.show()

(figure: training/test accuracy and loss curves)

import torch
from PIL import Image

classes = list(total_data.class_to_idx)  # class names, in index order

def predict_one_image(image_path, model, transform, classes):

    test_img = Image.open(image_path).convert('RGB')
    plt.imshow(test_img)  # show the image being predicted

    test_img = transform(test_img)
    img = test_img.unsqueeze(0).to(device)  # add a batch dimension, move to device

    model.eval()
    with torch.no_grad():  # no gradients needed for inference
        output = model(img)

    _, pred = torch.max(output, 1)
    pred_class = classes[pred.item()]
    print(f'Prediction: {pred_class}')

# Predict a single image from the training set
predict_one_image(image_path='./data/p/p7-data/Dark/dark (99).png', 
                  model=model, 
                  transform=train_transforms, 
                  classes=classes)
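Note that `train_transforms` is passed to the prediction call above; if it includes random augmentation, predictions will not be reproducible. For inference a deterministic transform is usually preferred. A minimal sketch using only PIL (the helper name and the 256→224 resize/crop values are illustrative assumptions, matching the 224×224 input in the model summaries below):

```python
from PIL import Image

def eval_resize_center_crop(img, resize=256, crop=224):
    """Hypothetical helper: deterministic resize + center crop for inference.

    Scales the shorter side to `resize`, then cuts a `crop` x `crop` square
    from the center -- no randomness, so predictions are reproducible."""
    w, h = img.size
    scale = resize / min(w, h)                      # scale shorter side to `resize`
    img = img.resize((round(w * scale), round(h * scale)))
    w, h = img.size
    left, top = (w - crop) // 2, (h - crop) // 2    # top-left corner of center crop
    return img.crop((left, top, left + crop, top + crop))

if __name__ == "__main__":
    demo = Image.new('RGB', (640, 480))             # stand-in for a coffee-bean photo
    print(eval_resize_center_crop(demo).size)       # (224, 224)
```

In practice one would wrap this (or `torchvision`'s equivalents) in a separate `test_transforms` pipeline and pass that to `predict_one_image` instead of `train_transforms`.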

Prediction: Dark
(figure: the coffee-bean image being predicted)

Training results

VGG-16 training results:

(figure: VGG-16 training curves)

Switching to SGD

(figure: training curves with SGD)
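Swapping Adam for SGD is a one-line change in the optimizer setup. A hedged sketch, using the learning rate from the training logs below; the momentum value is a common default, not taken from the original code, and the tiny `nn.Linear` stands in for the VGG-16 model:

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 4)  # stand-in for the VGG-16 model

# optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # original choice
optimizer = torch.optim.SGD(model.parameters(), lr=1e-4, momentum=0.9)

# one illustrative update step
loss = model(torch.randn(2, 4)).sum()
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

SGD with a small fixed learning rate typically converges more slowly than Adam early on, which is consistent with comparing the two sets of curves.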

Adjusting the network structure to slightly reduce the parameter count

==========================================================================================
Layer (type:depth-idx)                   Output Shape              Param #
==========================================================================================
├─Sequential: 1-1                        [-1, 64, 112, 112]        --
|    └─Conv2d: 2-1                       [-1, 64, 224, 224]        1,792
|    └─ReLU: 2-2                         [-1, 64, 224, 224]        --
|    └─Conv2d: 2-3                       [-1, 64, 224, 224]        36,928
|    └─ReLU: 2-4                         [-1, 64, 224, 224]        --
|    └─MaxPool2d: 2-5                    [-1, 64, 112, 112]        --
├─Sequential: 1-2                        [-1, 128, 56, 56]         --
|    └─Conv2d: 2-6                       [-1, 128, 112, 112]       73,856
|    └─ReLU: 2-7                         [-1, 128, 112, 112]       --
|    └─Conv2d: 2-8                       [-1, 128, 112, 112]       147,584
|    └─ReLU: 2-9                         [-1, 128, 112, 112]       --
|    └─MaxPool2d: 2-10                   [-1, 128, 56, 56]         --
├─Sequential: 1-3                        [-1, 256, 28, 28]         --
|    └─Conv2d: 2-11                      [-1, 256, 56, 56]         295,168
|    └─ReLU: 2-12                        [-1, 256, 56, 56]         --
|    └─Conv2d: 2-13                      [-1, 256, 56, 56]         590,080
|    └─ReLU: 2-14                        [-1, 256, 56, 56]         --
|    └─MaxPool2d: 2-15                   [-1, 256, 28, 28]         --
├─Sequential: 1-4                        [-1, 512, 14, 14]         --
|    └─Conv2d: 2-16                      [-1, 512, 28, 28]         1,180,160
|    └─ReLU: 2-17                        [-1, 512, 28, 28]         --
|    └─Conv2d: 2-18                      [-1, 512, 28, 28]         2,359,808
|    └─ReLU: 2-19                        [-1, 512, 28, 28]         --
|    └─MaxPool2d: 2-20                   [-1, 512, 14, 14]         --
├─Sequential: 1-5                        [-1, 512, 7, 7]           --
|    └─Conv2d: 2-21                      [-1, 512, 14, 14]         2,359,808
|    └─ReLU: 2-22                        [-1, 512, 14, 14]         --
|    └─Conv2d: 2-23                      [-1, 512, 14, 14]         2,359,808
|    └─ReLU: 2-24                        [-1, 512, 14, 14]         --
|    └─MaxPool2d: 2-25                   [-1, 512, 7, 7]           --
├─Sequential: 1-6                        [-1, 4]                   --
|    └─Linear: 2-26                      [-1, 4096]                102,764,544
|    └─ReLU: 2-27                        [-1, 4096]                --
|    └─Linear: 2-28                      [-1, 4096]                16,781,312
|    └─ReLU: 2-29                        [-1, 4096]                --
|    └─Linear: 2-30                      [-1, 4]                   16,388
==========================================================================================
Total params: 128,967,236
Trainable params: 128,967,236
Non-trainable params: 0
Total mult-adds (G): 11.43
==========================================================================================
Input size (MB): 0.57
Forward/backward pass size (MB): 93.47
Params size (MB): 491.97
Estimated Total Size (MB): 586.01
==========================================================================================
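The per-layer parameter counts in the summary follow directly from the standard formulas: a conv layer has out_ch × (in_ch × k × k) weights plus out_ch biases, and a linear layer has out × in weights plus out biases. A quick check against a few rows of the table above:

```python
def conv2d_params(in_ch, out_ch, k):
    """Weights (out_ch x in_ch x k x k) plus one bias per output channel."""
    return out_ch * in_ch * k * k + out_ch

def linear_params(in_f, out_f):
    """Weight matrix (out_f x in_f) plus one bias per output feature."""
    return out_f * in_f + out_f

print(conv2d_params(3, 64, 3))           # 1,792       -> first 3x3 conv layer
print(conv2d_params(64, 64, 3))          # 36,928
print(linear_params(512 * 7 * 7, 4096))  # 102,764,544 -> first FC layer
print(linear_params(4096, 4))            # 16,388      -> 4-class output layer
```

The first FC layer alone accounts for roughly 80% of the total, which is why the "reduce parameters" experiments below focus on shrinking the FC input.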

(figure: training curves of the adjusted network)
In terms of training results, this is essentially on par with the full VGG-16.

Reducing the parameter count

Using smaller convolution kernels, adding convolutional layers, and shrinking the fully connected layers reduces the parameter count to some extent.

Using cuda device
==========================================================================================
Layer (type:depth-idx)                   Output Shape              Param #
==========================================================================================
├─Sequential: 1-1                        [-1, 64, 112, 112]        --
|    └─Conv2d: 2-1                       [-1, 64, 224, 224]        1,792
|    └─ReLU: 2-2                         [-1, 64, 224, 224]        --
|    └─Conv2d: 2-3                       [-1, 64, 224, 224]        36,928
|    └─ReLU: 2-4                         [-1, 64, 224, 224]        --
|    └─MaxPool2d: 2-5                    [-1, 64, 112, 112]        --
├─Sequential: 1-2                        [-1, 128, 56, 56]         --
|    └─Conv2d: 2-6                       [-1, 128, 112, 112]       73,856
|    └─ReLU: 2-7                         [-1, 128, 112, 112]       --
|    └─MaxPool2d: 2-8                    [-1, 128, 56, 56]         --
├─Sequential: 1-3                        [-1, 256, 28, 28]         --
|    └─Conv2d: 2-9                       [-1, 256, 56, 56]         295,168
|    └─ReLU: 2-10                        [-1, 256, 56, 56]         --
|    └─Conv2d: 2-11                      [-1, 256, 56, 56]         590,080
|    └─ReLU: 2-12                        [-1, 256, 56, 56]         --
|    └─MaxPool2d: 2-13                   [-1, 256, 28, 28]         --
├─Sequential: 1-4                        [-1, 512, 15, 15]         --
|    └─Conv2d: 2-14                      [-1, 512, 29, 29]         524,800
|    └─ReLU: 2-15                        [-1, 512, 29, 29]         --
|    └─Conv2d: 2-16                      [-1, 512, 30, 30]         1,049,088
|    └─ReLU: 2-17                        [-1, 512, 30, 30]         --
|    └─Conv2d: 2-18                      [-1, 512, 31, 31]         1,049,088
|    └─ReLU: 2-19                        [-1, 512, 31, 31]         --
|    └─MaxPool2d: 2-20                   [-1, 512, 15, 15]         --
├─Sequential: 1-5                        [-1, 512, 9, 9]           --
|    └─Conv2d: 2-21                      [-1, 512, 16, 16]         1,049,088
|    └─ReLU: 2-22                        [-1, 512, 16, 16]         --
|    └─Conv2d: 2-23                      [-1, 512, 17, 17]         1,049,088
|    └─ReLU: 2-24                        [-1, 512, 17, 17]         --
|    └─Conv2d: 2-25                      [-1, 512, 18, 18]         1,049,088
|    └─ReLU: 2-26                        [-1, 512, 18, 18]         --
|    └─MaxPool2d: 2-27                   [-1, 512, 9, 9]           --
├─Sequential: 1-6                        [-1, 512, 3, 3]           --
|    └─Conv2d: 2-28                      [-1, 512, 9, 9]           2,359,808
|    └─ReLU: 2-29                        [-1, 512, 9, 9]           --
|    └─MaxPool2d: 2-30                   [-1, 512, 3, 3]           --
├─Sequential: 1-7                        [-1, 4]                   --
|    └─Linear: 2-31                      [-1, 4096]                18,878,464
|    └─ReLU: 2-32                        [-1, 4096]                --
|    └─Linear: 2-33                      [-1, 4096]                16,781,312
|    └─ReLU: 2-34                        [-1, 4096]                --
|    └─Linear: 2-35                      [-1, 4]                   16,388
==========================================================================================
Total params: 44,804,036
Trainable params: 44,804,036
Non-trainable params: 0
Total mult-adds (G): 9.21
==========================================================================================
Input size (MB): 0.57
Forward/backward pass size (MB): 87.83
Params size (MB): 170.91
Estimated Total Size (MB): 259.32
==========================================================================================
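The unusual growing spatial sizes in the two middle blocks (28 → 29 → 30 → 31) are consistent with 2×2 kernels and padding 1, by the standard output-size formula ⌊(n + 2p − k)/s⌋ + 1; the padding and stride values here are inferred from the shapes, not taken from the original code. A stdlib check:

```python
import math

def conv_out(n, k, p=0, s=1):
    """Spatial output size of a conv/pool layer: floor((n + 2p - k)/s) + 1."""
    return math.floor((n + 2 * p - k) / s) + 1

# A 2x2 kernel with padding 1 *grows* the feature map by one pixel per layer,
# matching the 28 -> 29 -> 30 -> 31 progression in the summary above:
print(conv_out(28, k=2, p=1))   # 29
print(conv_out(29, k=2, p=1))   # 30
print(conv_out(30, k=2, p=1))   # 31
# 2x2 max pooling with stride 2 then brings 31 down to 15:
print(conv_out(31, k=2, s=2))   # 15
# and the parameter count confirms the 2x2 kernel: 512*256*2*2 weights + 512 biases
print(512 * 256 * 2 * 2 + 512)  # 524,800
```

Shrinking the final feature map to 512 × 3 × 3 is what cuts the first FC layer from 102.8M to 18.9M parameters.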

Parameter count: reduced from 134,276,932 to 44,804,036, roughly one third of the original, and the test results are still decent.
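The "one third" claim is easy to verify from the two totals quoted in the text:

```python
full_vgg16 = 134_276_932   # full VGG-16 total, as quoted above
reduced    = 44_804_036    # the trimmed network from the summary above

ratio = reduced / full_vgg16
print(f"{ratio:.3f}")                                  # 0.334 -- almost exactly 1/3
print(f"saved {full_vgg16 - reduced:,} parameters")    # saved 89,472,896 parameters
```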

Epoch: 1, Train_acc:24.4%, Train_loss:1.389, Test_acc:23.8%, Test_loss:1.386, Lr:1.00E-04
Epoch: 2, Train_acc:39.4%, Train_loss:1.173, Test_acc:47.9%, Test_loss:1.004, Lr:1.00E-04
Epoch: 3, Train_acc:56.5%, Train_loss:0.768, Test_acc:59.2%, Test_loss:0.831, Lr:1.00E-04
Epoch: 4, Train_acc:72.3%, Train_loss:0.604, Test_acc:81.2%, Test_loss:0.548, Lr:1.00E-04
Epoch: 5, Train_acc:85.5%, Train_loss:0.334, Test_acc:69.6%, Test_loss:0.862, Lr:1.00E-04
Epoch: 6, Train_acc:82.2%, Train_loss:0.407, Test_acc:83.8%, Test_loss:0.400, Lr:1.00E-04
Epoch: 7, Train_acc:91.6%, Train_loss:0.229, Test_acc:90.0%, Test_loss:0.253, Lr:1.00E-04
Epoch: 8, Train_acc:90.6%, Train_loss:0.240, Test_acc:85.4%, Test_loss:0.366, Lr:1.00E-04
Epoch: 9, Train_acc:94.2%, Train_loss:0.164, Test_acc:93.8%, Test_loss:0.186, Lr:1.00E-04
Epoch:10, Train_acc:94.1%, Train_loss:0.146, Test_acc:93.3%, Test_loss:0.161, Lr:1.00E-04
Epoch:11, Train_acc:96.8%, Train_loss:0.082, Test_acc:94.6%, Test_loss:0.138, Lr:1.00E-04
Epoch:12, Train_acc:97.5%, Train_loss:0.073, Test_acc:94.2%, Test_loss:0.203, Lr:1.00E-04
Epoch:13, Train_acc:98.1%, Train_loss:0.058, Test_acc:94.6%, Test_loss:0.229, Lr:1.00E-04
Epoch:14, Train_acc:95.3%, Train_loss:0.133, Test_acc:95.4%, Test_loss:0.150, Lr:1.00E-04
Epoch:15, Train_acc:98.9%, Train_loss:0.047, Test_acc:96.7%, Test_loss:0.133, Lr:1.00E-04
Epoch:16, Train_acc:94.2%, Train_loss:0.153, Test_acc:90.8%, Test_loss:0.187, Lr:1.00E-04
Epoch:17, Train_acc:97.0%, Train_loss:0.074, Test_acc:97.9%, Test_loss:0.063, Lr:1.00E-04
Epoch:18, Train_acc:97.5%, Train_loss:0.059, Test_acc:97.1%, Test_loss:0.071, Lr:1.00E-04
Epoch:19, Train_acc:98.9%, Train_loss:0.029, Test_acc:96.7%, Test_loss:0.105, Lr:1.00E-04
Epoch:20, Train_acc:97.9%, Train_loss:0.051, Test_acc:97.1%, Test_loss:0.068, Lr:1.00E-04
Epoch:21, Train_acc:98.2%, Train_loss:0.043, Test_acc:97.9%, Test_loss:0.061, Lr:1.00E-04
Epoch:22, Train_acc:98.2%, Train_loss:0.042, Test_acc:96.7%, Test_loss:0.131, Lr:1.00E-04
Epoch:23, Train_acc:99.3%, Train_loss:0.029, Test_acc:98.3%, Test_loss:0.049, Lr:1.00E-04
Epoch:24, Train_acc:99.7%, Train_loss:0.011, Test_acc:97.9%, Test_loss:0.084, Lr:1.00E-04
Epoch:25, Train_acc:99.7%, Train_loss:0.008, Test_acc:97.9%, Test_loss:0.048, Lr:1.00E-04
Epoch:26, Train_acc:95.9%, Train_loss:0.147, Test_acc:95.8%, Test_loss:0.127, Lr:1.00E-04
Epoch:27, Train_acc:99.0%, Train_loss:0.044, Test_acc:96.7%, Test_loss:0.057, Lr:1.00E-04
Epoch:28, Train_acc:95.0%, Train_loss:0.161, Test_acc:91.7%, Test_loss:0.226, Lr:1.00E-04
Epoch:29, Train_acc:98.5%, Train_loss:0.049, Test_acc:96.2%, Test_loss:0.100, Lr:1.00E-04
Epoch:30, Train_acc:99.4%, Train_loss:0.024, Test_acc:97.5%, Test_loss:0.054, Lr:1.00E-04
Epoch:31, Train_acc:99.2%, Train_loss:0.018, Test_acc:96.2%, Test_loss:0.136, Lr:1.00E-04
Epoch:32, Train_acc:97.8%, Train_loss:0.060, Test_acc:96.2%, Test_loss:0.104, Lr:1.00E-04
Epoch:33, Train_acc:97.3%, Train_loss:0.078, Test_acc:94.6%, Test_loss:0.157, Lr:1.00E-04
Epoch:34, Train_acc:98.1%, Train_loss:0.059, Test_acc:95.8%, Test_loss:0.350, Lr:1.00E-04
Epoch:35, Train_acc:98.3%, Train_loss:0.038, Test_acc:97.5%, Test_loss:0.128, Lr:1.00E-04
Epoch:36, Train_acc:99.1%, Train_loss:0.030, Test_acc:97.9%, Test_loss:0.061, Lr:1.00E-04
Epoch:37, Train_acc:98.6%, Train_loss:0.039, Test_acc:96.2%, Test_loss:0.116, Lr:1.00E-04
Epoch:38, Train_acc:98.6%, Train_loss:0.040, Test_acc:96.2%, Test_loss:0.136, Lr:1.00E-04
Epoch:39, Train_acc:99.9%, Train_loss:0.005, Test_acc:97.9%, Test_loss:0.070, Lr:1.00E-04
Epoch:40, Train_acc:97.8%, Train_loss:0.082, Test_acc:96.7%, Test_loss:0.129, Lr:1.00E-04
Done