Week R1: RNN - Heart Disease Prediction

  🍨 This post is a study log for the 365-Day Deep Learning Training Camp

  🍦 Reference: 365-Day Deep Learning Training Camp, Week R1: RNN - Heart Disease Prediction (readable by camp members only)

  🍖 Original author: K同学啊 | tutoring and custom projects available

🍺 Difficulty: beginner
🍺 Requirements:
1. Load the data from a local file.
2. Understand how a recurrent neural network (RNN) is built.
3. Reach 87% accuracy on the test set.

 🏡 Environment:
OS: Windows 10
Language: Python 3.10
IDE: PyCharm 2022.1.1
Deep learning framework: TensorFlow


Contents

I. Preliminaries

1. Set up the GPU

2. Import the data

3. Inspect the data

II. Data preprocessing

1. Split into training and test sets

2. Standardization

III. Build the RNN model

IV. Compile the model

V. Train the model

VI. Evaluate the model


I. Preliminaries

1. Set up the GPU

import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
if gpus:
    gpu0 = gpus[0]  # if there are several GPUs, use only GPU 0
    tf.config.experimental.set_memory_growth(gpu0, True)  # allocate GPU memory on demand
    tf.config.set_visible_devices([gpu0], "GPU")
gpus

  (Setting up the GPU keeps failing on my machine!)

 2. Import the data

Dataset fields:


· age: age
· sex: sex
· cp: chest pain type (4 values)
· trestbps: resting blood pressure
· chol: serum cholesterol (mg/dl)
· fbs: fasting blood sugar > 120 mg/dl
· restecg: resting electrocardiographic results (values 0, 1, 2)
· thalach: maximum heart rate achieved
· exang: exercise-induced angina
· oldpeak: ST depression induced by exercise relative to rest
· slope: slope of the peak exercise ST segment
· ca: number of major vessels (0-3) colored by fluoroscopy
· thal: 0 = normal; 1 = fixed defect; 2 = reversible defect
· target: 0 = lower chance of a heart attack; 1 = higher chance of a heart attack


import pandas as pd
import numpy as np

df = pd.read_csv("F:\\布尔津\\heart.csv")
df
     age  sex  cp  trestbps  chol  fbs  restecg  thalach  exang  oldpeak  slope  ca  thal  target
0     63    1   3       145   233    1        0      150      0      2.3      0   0     1       1
1     37    1   2       130   250    0        1      187      0      3.5      0   0     2       1
2     41    0   1       130   204    0        0      172      0      1.4      2   0     2       1
3     56    1   1       120   236    0        1      178      0      0.8      2   0     2       1
4     57    0   0       120   354    0        1      163      1      0.6      2   0     2       1
..   ...  ...  ..       ...   ...  ...      ...      ...    ...      ...    ...  ..   ...     ...
298   57    0   0       140   241    0        1      123      1      0.2      1   0     3       0
299   45    1   3       110   264    0        1      132      0      1.2      1   0     3       0
300   68    1   0       144   193    1        1      141      0      3.4      1   2     3       0
301   57    1   0       130   131    0        1      115      1      1.2      1   1     3       0
302   57    0   1       130   236    0        0      174      0      0.0      1   1     2       0
303 rows × 14 columns

3. Inspect the data

# check for missing values
df.isnull().sum()
age         0
sex         0
cp          0
trestbps    0
chol        0
fbs         0
restecg     0
thalach     0
exang       0
oldpeak     0
slope       0
ca          0
thal        0
target      0
dtype: int64

II. Data preprocessing

1. Split into training and test sets


 🍺 The relationship between the test set and the validation set:
1. The validation set takes no part in gradient descent; strictly speaking, it does not contribute to updating the model's parameters.
2. More broadly, though, the validation set does feed into a "manual tuning" loop: based on the model's performance on the validation data after each epoch, we decide whether to stop training early, and we use the performance trend to tune hyperparameters such as the learning rate and batch_size.
3. In that sense the validation set does influence training, but without the model overfitting it.
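The early-stopping rule described in point 2 can be sketched in plain Python (a framework-agnostic illustration; in Keras the equivalent is the `tf.keras.callbacks.EarlyStopping` callback with a `patience` argument):

```python
def early_stop_epoch(val_losses, patience=3):
    """Return the epoch index at which training would stop, or None.

    Training stops once the validation loss has failed to improve for
    `patience` consecutive epochs -- the "manual tuning" decision above,
    automated.
    """
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:      # improvement: remember it and reset the counter
            best, wait = loss, 0
        else:                # no improvement: spend one unit of patience
            wait += 1
            if wait >= patience:
                return epoch
    return None

# validation loss improves twice, then stalls for 3 epochs -> stop at epoch 4
print(early_stop_epoch([0.5, 0.4, 0.41, 0.42, 0.43]))  # 4
```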


from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

X = df.iloc[:,:-1]
y = df.iloc[:,-1]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, random_state=1)

X_train.shape, y_train.shape
((272, 13), (272,))

2. Standardization

# standardize each feature column to zero mean and unit variance (column-wise)
sc = StandardScaler()
X_train = sc.fit_transform(X_train)
X_test = sc.transform(X_test)

X_train = X_train.reshape(X_train.shape[0],X_train.shape[1],1)
X_test = X_test.reshape(X_test.shape[0],X_test.shape[1],1)
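The key point in the cell above is that the scaler is fitted on the training set only and then applied to the test set, so no test-set statistics leak into training. A minimal NumPy sketch of what `StandardScaler` does (the `standardize` helper is ours, not the sklearn implementation):

```python
import numpy as np

def standardize(train, test):
    """Fit mean/std on the training data only, then apply to both splits
    (mirrors StandardScaler's fit_transform / transform pair)."""
    mu = train.mean(axis=0)
    sigma = train.std(axis=0)   # population std, as StandardScaler uses
    return (train - mu) / sigma, (test - mu) / sigma

train = np.array([[1.0, 10.0], [3.0, 30.0]])
test = np.array([[2.0, 20.0]])
train_s, test_s = standardize(train, test)
print(train_s)  # each column now has mean 0 and std 1
print(test_s)   # scaled with the *training* statistics: [[0. 0.]]
```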

III. Build the RNN model

🍺 Function signature:

 tf.keras.layers.SimpleRNN(units, activation='tanh', use_bias=True, kernel_initializer='glorot_uniform',
     recurrent_initializer='orthogonal', bias_initializer='zeros', kernel_regularizer=None,
     recurrent_regularizer=None, bias_regularizer=None, activity_regularizer=None,
     kernel_constraint=None, recurrent_constraint=None, bias_constraint=None,
     dropout=0.0, recurrent_dropout=0.0, return_sequences=False, return_state=False,
     go_backwards=False, stateful=False, unroll=False, **kwargs)

🍺 Key parameters:


· units: positive integer, dimensionality of the output space.
· activation: activation function to use; default: hyperbolic tangent (tanh). If None is passed, no activation is applied (i.e. linear activation: a(x) = x).
· use_bias: boolean, whether the layer uses a bias vector.
· kernel_initializer: initializer for the kernel weight matrix, used for the linear transformation of the inputs (see initializers).
· recurrent_initializer: initializer for the recurrent_kernel weight matrix, used for the linear transformation of the recurrent state (see initializers).
· bias_initializer: initializer for the bias vector (see initializers).
· dropout: float between 0 and 1; fraction of the units to drop for the linear transformation of the inputs.
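Under the hood, a SimpleRNN applies the same cell at every timestep: h_t = activation(x_t · W + h_{t-1} · U + b), where W is the kernel, U the recurrent kernel, and b the bias. A minimal NumPy sketch of that forward pass (illustrative only; the weights here are random, whereas Keras trains them):

```python
import numpy as np

def simple_rnn_forward(x, W, U, b, activation=np.tanh):
    """Forward pass of one SimpleRNN cell over a single sequence.

    x: (timesteps, input_dim); W: (input_dim, units) input kernel;
    U: (units, units) recurrent kernel; b: (units,) bias.
    Returns the final hidden state, shape (units,) -- what Keras emits
    when return_sequences=False.
    """
    h = np.zeros(U.shape[0])                 # initial state is all zeros
    for x_t in x:
        h = activation(x_t @ W + h @ U + b)  # h_t from x_t and h_{t-1}
    return h

rng = np.random.default_rng(0)
timesteps, input_dim, units = 13, 1, 4       # 13 features fed as 13 "timesteps"
x = rng.normal(size=(timesteps, input_dim))
W = rng.normal(size=(input_dim, units)) * 0.1
U = rng.normal(size=(units, units)) * 0.1
b = np.zeros(units)
h = simple_rnn_forward(x, W, U, b)
print(h.shape)  # (4,)
```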


import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LSTM, SimpleRNN

model = Sequential()
model.add(SimpleRNN(200, input_shape=(13, 1), activation='relu'))
model.add(Dense(100, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.summary()

Model: "sequential_1"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 simple_rnn_1 (SimpleRNN)    (None, 200)               40400                                                               
 dense_2 (Dense)             (None, 100)               20100     
                                                                 
 dense_3 (Dense)             (None, 1)                 101       
                                                                 
=================================================================
Total params: 60,601
Trainable params: 60,601
Non-trainable params: 0
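The parameter counts in the summary can be checked by hand: a SimpleRNN layer holds units × input_dim input-kernel weights, units × units recurrent-kernel weights, and units biases; each Dense layer holds in × out weights plus out biases:

```python
units, input_dim = 200, 1

# SimpleRNN: input kernel + recurrent kernel + bias
rnn_params = units * input_dim + units * units + units
dense1_params = 200 * 100 + 100   # Dense(100) on the 200-dim RNN output
dense2_params = 100 * 1 + 1       # Dense(1) on the 100-dim hidden layer

print(rnn_params, dense1_params, dense2_params)    # 40400 20100 101
print(rnn_params + dense1_params + dense2_params)  # 60601 -- matches Total params
```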


IV. Compile the model

opt = tf.keras.optimizers.Adam(learning_rate=1e-4)

model.compile(loss='binary_crossentropy',
              optimizer=opt,
              metrics=['accuracy'])

V. Train the model

epochs = 100

history = model.fit(X_train, y_train, 
                    epochs=epochs, 
                    batch_size=128, 
                    validation_data=(X_test, y_test),
                    verbose=1)

Epoch 1/100
3/3 [==============================] - 1s 164ms/step - loss: 0.6844 - accuracy: 0.5441 - val_loss: 0.6815 - val_accuracy: 0.4839
Epoch 2/100
3/3 [==============================] - 0s 28ms/step - loss: 0.6759 - accuracy: 0.5882 - val_loss: 0.6658 - val_accuracy: 0.5484
Epoch 3/100
3/3 [==============================] - 0s 26ms/step - loss: 0.6678 - accuracy: 0.6654 - val_loss: 0.6518 - val_accuracy: 0.7742
Epoch 4/100
3/3 [==============================] - 0s 22ms/step - loss: 0.6606 - accuracy: 0.7279 - val_loss: 0.6384 - val_accuracy: 0.8710
Epoch 5/100
3/3 [==============================] - 0s 21ms/step - loss: 0.6535 - accuracy: 0.7390 - val_loss: 0.6257 - val_accuracy: 0.8710
Epoch 6/100
3/3 [==============================] - 0s 21ms/step - loss: 0.6464 - accuracy: 0.7353 - val_loss: 0.6135 - val_accuracy: 0.8710
Epoch 7/100
3/3 [==============================] - 0s 22ms/step - loss: 0.6392 - accuracy: 0.7426 - val_loss: 0.6017 - val_accuracy: 0.8710
Epoch 8/100
3/3 [==============================] - 0s 21ms/step - loss: 0.6321 - accuracy: 0.7574 - val_loss: 0.5896 - val_accuracy: 0.8710
Epoch 9/100
3/3 [==============================] - 0s 25ms/step - loss: 0.6244 - accuracy: 0.7610 - val_loss: 0.5774 - val_accuracy: 0.9032
Epoch 10/100
3/3 [==============================] - 0s 22ms/step - loss: 0.6164 - accuracy: 0.7721 - val_loss: 0.5644 - val_accuracy: 0.9032
Epoch 11/100
3/3 [==============================] - 0s 20ms/step - loss: 0.6080 - accuracy: 0.7794 - val_loss: 0.5509 - val_accuracy: 0.9032
Epoch 12/100
3/3 [==============================] - 0s 23ms/step - loss: 0.5994 - accuracy: 0.7794 - val_loss: 0.5368 - val_accuracy: 0.9032
Epoch 13/100
3/3 [==============================] - 0s 24ms/step - loss: 0.5896 - accuracy: 0.7757 - val_loss: 0.5218 - val_accuracy: 0.9032
Epoch 14/100
3/3 [==============================] - 0s 22ms/step - loss: 0.5796 - accuracy: 0.7831 - val_loss: 0.5060 - val_accuracy: 0.9032
Epoch 15/100
3/3 [==============================] - 0s 22ms/step - loss: 0.5687 - accuracy: 0.7794 - val_loss: 0.4889 - val_accuracy: 0.9032
Epoch 16/100
3/3 [==============================] - 0s 23ms/step - loss: 0.5575 - accuracy: 0.7868 - val_loss: 0.4715 - val_accuracy: 0.8710
Epoch 17/100
3/3 [==============================] - 0s 23ms/step - loss: 0.5451 - accuracy: 0.7868 - val_loss: 0.4542 - val_accuracy: 0.8387
Epoch 18/100
3/3 [==============================] - 0s 30ms/step - loss: 0.5328 - accuracy: 0.7831 - val_loss: 0.4357 - val_accuracy: 0.8387
Epoch 19/100
3/3 [==============================] - 0s 29ms/step - loss: 0.5208 - accuracy: 0.7794 - val_loss: 0.4169 - val_accuracy: 0.8710
Epoch 20/100
3/3 [==============================] - 0s 29ms/step - loss: 0.5081 - accuracy: 0.7904 - val_loss: 0.3994 - val_accuracy: 0.8710
Epoch 21/100
3/3 [==============================] - 0s 25ms/step - loss: 0.4945 - accuracy: 0.7978 - val_loss: 0.3828 - val_accuracy: 0.8710
Epoch 22/100
3/3 [==============================] - 0s 22ms/step - loss: 0.4828 - accuracy: 0.8051 - val_loss: 0.3666 - val_accuracy: 0.8710
Epoch 23/100
3/3 [==============================] - 0s 23ms/step - loss: 0.4709 - accuracy: 0.8088 - val_loss: 0.3527 - val_accuracy: 0.8710
Epoch 24/100
3/3 [==============================] - 0s 29ms/step - loss: 0.4618 - accuracy: 0.8125 - val_loss: 0.3408 - val_accuracy: 0.8710
Epoch 25/100
3/3 [==============================] - 0s 26ms/step - loss: 0.4533 - accuracy: 0.8125 - val_loss: 0.3301 - val_accuracy: 0.8710
Epoch 26/100
3/3 [==============================] - 0s 21ms/step - loss: 0.4443 - accuracy: 0.8088 - val_loss: 0.3202 - val_accuracy: 0.8710
Epoch 27/100
3/3 [==============================] - 0s 23ms/step - loss: 0.4384 - accuracy: 0.8051 - val_loss: 0.3115 - val_accuracy: 0.8710
Epoch 28/100
3/3 [==============================] - 0s 24ms/step - loss: 0.4328 - accuracy: 0.8015 - val_loss: 0.3045 - val_accuracy: 0.8710
Epoch 29/100
3/3 [==============================] - 0s 32ms/step - loss: 0.4288 - accuracy: 0.8051 - val_loss: 0.2966 - val_accuracy: 0.8710
Epoch 30/100
3/3 [==============================] - 0s 34ms/step - loss: 0.4221 - accuracy: 0.8088 - val_loss: 0.2919 - val_accuracy: 0.8710
Epoch 31/100
3/3 [==============================] - 0s 24ms/step - loss: 0.4221 - accuracy: 0.8125 - val_loss: 0.2903 - val_accuracy: 0.8710
Epoch 32/100
3/3 [==============================] - 0s 27ms/step - loss: 0.4183 - accuracy: 0.8199 - val_loss: 0.2904 - val_accuracy: 0.8710
Epoch 33/100
3/3 [==============================] - 0s 21ms/step - loss: 0.4140 - accuracy: 0.8125 - val_loss: 0.2911 - val_accuracy: 0.8710
Epoch 34/100
3/3 [==============================] - 0s 22ms/step - loss: 0.4096 - accuracy: 0.8125 - val_loss: 0.2907 - val_accuracy: 0.8710
Epoch 35/100
3/3 [==============================] - 0s 23ms/step - loss: 0.4060 - accuracy: 0.8088 - val_loss: 0.2894 - val_accuracy: 0.8710
Epoch 36/100
3/3 [==============================] - 0s 22ms/step - loss: 0.4027 - accuracy: 0.8162 - val_loss: 0.2871 - val_accuracy: 0.8710
Epoch 37/100
3/3 [==============================] - 0s 19ms/step - loss: 0.3997 - accuracy: 0.8162 - val_loss: 0.2836 - val_accuracy: 0.8710
Epoch 38/100
3/3 [==============================] - 0s 21ms/step - loss: 0.3953 - accuracy: 0.8235 - val_loss: 0.2810 - val_accuracy: 0.8710
Epoch 39/100
3/3 [==============================] - 0s 20ms/step - loss: 0.3929 - accuracy: 0.8309 - val_loss: 0.2774 - val_accuracy: 0.8710
Epoch 40/100
3/3 [==============================] - 0s 21ms/step - loss: 0.3911 - accuracy: 0.8235 - val_loss: 0.2736 - val_accuracy: 0.8710
Epoch 41/100
3/3 [==============================] - 0s 20ms/step - loss: 0.3872 - accuracy: 0.8272 - val_loss: 0.2731 - val_accuracy: 0.8710
Epoch 42/100
3/3 [==============================] - 0s 24ms/step - loss: 0.3861 - accuracy: 0.8272 - val_loss: 0.2766 - val_accuracy: 0.8710
Epoch 43/100
3/3 [==============================] - 0s 26ms/step - loss: 0.3901 - accuracy: 0.8235 - val_loss: 0.2805 - val_accuracy: 0.9032
Epoch 44/100
3/3 [==============================] - 0s 25ms/step - loss: 0.3913 - accuracy: 0.8235 - val_loss: 0.2732 - val_accuracy: 0.9032
Epoch 45/100
3/3 [==============================] - 0s 25ms/step - loss: 0.3836 - accuracy: 0.8309 - val_loss: 0.2665 - val_accuracy: 0.8710
Epoch 46/100
3/3 [==============================] - 0s 25ms/step - loss: 0.3770 - accuracy: 0.8272 - val_loss: 0.2651 - val_accuracy: 0.8710
Epoch 47/100
3/3 [==============================] - 0s 19ms/step - loss: 0.3741 - accuracy: 0.8272 - val_loss: 0.2652 - val_accuracy: 0.9032
Epoch 48/100
3/3 [==============================] - 0s 22ms/step - loss: 0.3716 - accuracy: 0.8346 - val_loss: 0.2661 - val_accuracy: 0.9032
Epoch 49/100
3/3 [==============================] - 0s 22ms/step - loss: 0.3692 - accuracy: 0.8346 - val_loss: 0.2658 - val_accuracy: 0.9032
Epoch 50/100
3/3 [==============================] - 0s 25ms/step - loss: 0.3665 - accuracy: 0.8346 - val_loss: 0.2639 - val_accuracy: 0.8710
Epoch 51/100
3/3 [==============================] - 0s 25ms/step - loss: 0.3633 - accuracy: 0.8346 - val_loss: 0.2621 - val_accuracy: 0.8387
Epoch 52/100
3/3 [==============================] - 0s 23ms/step - loss: 0.3635 - accuracy: 0.8272 - val_loss: 0.2614 - val_accuracy: 0.8387
Epoch 53/100
3/3 [==============================] - 0s 26ms/step - loss: 0.3612 - accuracy: 0.8346 - val_loss: 0.2608 - val_accuracy: 0.8387
Epoch 54/100
3/3 [==============================] - 0s 25ms/step - loss: 0.3577 - accuracy: 0.8346 - val_loss: 0.2613 - val_accuracy: 0.8710
Epoch 55/100
3/3 [==============================] - 0s 21ms/step - loss: 0.3544 - accuracy: 0.8419 - val_loss: 0.2625 - val_accuracy: 0.8710
Epoch 56/100
3/3 [==============================] - 0s 20ms/step - loss: 0.3533 - accuracy: 0.8382 - val_loss: 0.2694 - val_accuracy: 0.9032
Epoch 57/100
3/3 [==============================] - 0s 21ms/step - loss: 0.3579 - accuracy: 0.8346 - val_loss: 0.2776 - val_accuracy: 0.9032
Epoch 58/100
3/3 [==============================] - 0s 21ms/step - loss: 0.3582 - accuracy: 0.8419 - val_loss: 0.2741 - val_accuracy: 0.9032
Epoch 59/100
3/3 [==============================] - 0s 20ms/step - loss: 0.3503 - accuracy: 0.8382 - val_loss: 0.2702 - val_accuracy: 0.8710
Epoch 60/100
3/3 [==============================] - 0s 20ms/step - loss: 0.3456 - accuracy: 0.8456 - val_loss: 0.2694 - val_accuracy: 0.8710
Epoch 61/100
3/3 [==============================] - 0s 22ms/step - loss: 0.3442 - accuracy: 0.8456 - val_loss: 0.2715 - val_accuracy: 0.8710
Epoch 62/100
3/3 [==============================] - 0s 20ms/step - loss: 0.3457 - accuracy: 0.8529 - val_loss: 0.2726 - val_accuracy: 0.8710
Epoch 63/100
3/3 [==============================] - 0s 20ms/step - loss: 0.3433 - accuracy: 0.8529 - val_loss: 0.2720 - val_accuracy: 0.8710
Epoch 64/100
3/3 [==============================] - 0s 19ms/step - loss: 0.3387 - accuracy: 0.8493 - val_loss: 0.2718 - val_accuracy: 0.8710
Epoch 65/100
3/3 [==============================] - 0s 21ms/step - loss: 0.3332 - accuracy: 0.8382 - val_loss: 0.2753 - val_accuracy: 0.9032
Epoch 66/100
3/3 [==============================] - 0s 22ms/step - loss: 0.3311 - accuracy: 0.8456 - val_loss: 0.2795 - val_accuracy: 0.9032
Epoch 67/100
3/3 [==============================] - 0s 19ms/step - loss: 0.3290 - accuracy: 0.8529 - val_loss: 0.2776 - val_accuracy: 0.9032
Epoch 68/100
3/3 [==============================] - 0s 20ms/step - loss: 0.3240 - accuracy: 0.8529 - val_loss: 0.2758 - val_accuracy: 0.9032
Epoch 69/100
3/3 [==============================] - 0s 18ms/step - loss: 0.3220 - accuracy: 0.8566 - val_loss: 0.2764 - val_accuracy: 0.9032
Epoch 70/100
3/3 [==============================] - 0s 19ms/step - loss: 0.3203 - accuracy: 0.8603 - val_loss: 0.2774 - val_accuracy: 0.9032
Epoch 71/100
3/3 [==============================] - 0s 19ms/step - loss: 0.3177 - accuracy: 0.8603 - val_loss: 0.2775 - val_accuracy: 0.9032
Epoch 72/100
3/3 [==============================] - 0s 19ms/step - loss: 0.3127 - accuracy: 0.8603 - val_loss: 0.2802 - val_accuracy: 0.9032
Epoch 73/100
3/3 [==============================] - 0s 19ms/step - loss: 0.3109 - accuracy: 0.8676 - val_loss: 0.2860 - val_accuracy: 0.9032
Epoch 74/100
3/3 [==============================] - 0s 21ms/step - loss: 0.3115 - accuracy: 0.8603 - val_loss: 0.2897 - val_accuracy: 0.9032
Epoch 75/100
3/3 [==============================] - 0s 20ms/step - loss: 0.3122 - accuracy: 0.8603 - val_loss: 0.2909 - val_accuracy: 0.9032
Epoch 76/100
3/3 [==============================] - 0s 19ms/step - loss: 0.3085 - accuracy: 0.8640 - val_loss: 0.2871 - val_accuracy: 0.9032
Epoch 77/100
3/3 [==============================] - 0s 20ms/step - loss: 0.3025 - accuracy: 0.8676 - val_loss: 0.2846 - val_accuracy: 0.9032
Epoch 78/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2996 - accuracy: 0.8750 - val_loss: 0.2878 - val_accuracy: 0.9032
Epoch 79/100
3/3 [==============================] - 0s 20ms/step - loss: 0.2968 - accuracy: 0.8713 - val_loss: 0.2926 - val_accuracy: 0.9032
Epoch 80/100
3/3 [==============================] - 0s 22ms/step - loss: 0.2946 - accuracy: 0.8713 - val_loss: 0.2948 - val_accuracy: 0.9032
Epoch 81/100
3/3 [==============================] - 0s 20ms/step - loss: 0.2930 - accuracy: 0.8750 - val_loss: 0.2952 - val_accuracy: 0.9032
Epoch 82/100
3/3 [==============================] - 0s 20ms/step - loss: 0.2908 - accuracy: 0.8750 - val_loss: 0.2988 - val_accuracy: 0.9032
Epoch 83/100
3/3 [==============================] - 0s 20ms/step - loss: 0.2883 - accuracy: 0.8713 - val_loss: 0.3004 - val_accuracy: 0.9032
Epoch 84/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2864 - accuracy: 0.8787 - val_loss: 0.2977 - val_accuracy: 0.9032
Epoch 85/100
3/3 [==============================] - 0s 19ms/step - loss: 0.2887 - accuracy: 0.8750 - val_loss: 0.3012 - val_accuracy: 0.9032
Epoch 86/100
3/3 [==============================] - 0s 19ms/step - loss: 0.2885 - accuracy: 0.8676 - val_loss: 0.2970 - val_accuracy: 0.9032
Epoch 87/100
3/3 [==============================] - 0s 20ms/step - loss: 0.2825 - accuracy: 0.8787 - val_loss: 0.2944 - val_accuracy: 0.9032
Epoch 88/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2781 - accuracy: 0.8860 - val_loss: 0.2947 - val_accuracy: 0.9032
Epoch 89/100
3/3 [==============================] - 0s 22ms/step - loss: 0.2765 - accuracy: 0.8897 - val_loss: 0.2953 - val_accuracy: 0.9032
Epoch 90/100
3/3 [==============================] - 0s 20ms/step - loss: 0.2760 - accuracy: 0.8860 - val_loss: 0.3000 - val_accuracy: 0.9032
Epoch 91/100
3/3 [==============================] - 0s 23ms/step - loss: 0.2740 - accuracy: 0.8860 - val_loss: 0.3071 - val_accuracy: 0.9032
Epoch 92/100
3/3 [==============================] - 0s 19ms/step - loss: 0.2714 - accuracy: 0.8860 - val_loss: 0.3139 - val_accuracy: 0.9032
Epoch 93/100
3/3 [==============================] - 0s 19ms/step - loss: 0.2698 - accuracy: 0.8860 - val_loss: 0.3157 - val_accuracy: 0.8710
Epoch 94/100
3/3 [==============================] - 0s 20ms/step - loss: 0.2707 - accuracy: 0.8897 - val_loss: 0.3160 - val_accuracy: 0.8710
Epoch 95/100
3/3 [==============================] - 0s 20ms/step - loss: 0.2684 - accuracy: 0.8860 - val_loss: 0.3093 - val_accuracy: 0.8710
Epoch 96/100
3/3 [==============================] - 0s 19ms/step - loss: 0.2648 - accuracy: 0.8897 - val_loss: 0.2961 - val_accuracy: 0.8710
Epoch 97/100
3/3 [==============================] - 0s 22ms/step - loss: 0.2641 - accuracy: 0.8971 - val_loss: 0.2897 - val_accuracy: 0.8710
Epoch 98/100
3/3 [==============================] - 0s 19ms/step - loss: 0.2598 - accuracy: 0.8971 - val_loss: 0.2922 - val_accuracy: 0.9032
Epoch 99/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2621 - accuracy: 0.8860 - val_loss: 0.3007 - val_accuracy: 0.9032
Epoch 100/100
3/3 [==============================] - 0s 22ms/step - loss: 0.2615 - accuracy: 0.8787 - val_loss: 0.2992 - val_accuracy: 0.9032


VI. Evaluate the model

import matplotlib.pyplot as plt

acc = history.history['accuracy']
val_acc = history.history['val_accuracy']

loss = history.history['loss']
val_loss = history.history['val_loss']

epochs_range = range(epochs)

plt.figure(figsize=(14, 4))
plt.subplot(1, 2, 1)

plt.plot(epochs_range, acc, label='Training Accuracy')
plt.plot(epochs_range, val_acc, label='Validation Accuracy')
plt.legend(loc='lower right')
plt.title('Training and Validation Accuracy')

plt.subplot(1, 2, 2)
plt.plot(epochs_range, loss, label='Training Loss')
plt.plot(epochs_range, val_loss, label='Validation Loss')
plt.legend(loc='upper right')
plt.title('Training and Validation Loss')
plt.show()

scores = model.evaluate(X_test, y_test, verbose=0)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

 accuracy: 90.32%
