🍨 This article is a study-log entry for the 🔗 365-day deep learning training camp
🍦 Reference: 365-day deep learning training camp - Week R1: RNN heart disease prediction (available to camp members)
🍖 Original author: K同学啊 | tutoring and custom projects available
🍺 Difficulty: beginner
🍺 Requirements:
1. Read and load the data locally.
2. Understand how a recurrent neural network (RNN) is built.
3. Reach 87% accuracy on the test set.
🏡 Environment:
OS: Windows 10
Language: Python 3.10
IDE: PyCharm 2022.1.1
Deep learning framework: TensorFlow
I. Preparation
1. Set up the GPU
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
if gpus:
    gpu0 = gpus[0]  # if there are multiple GPUs, use only GPU 0
    tf.config.experimental.set_memory_growth(gpu0, True)  # allocate GPU memory on demand
    tf.config.set_visible_devices([gpu0], "GPU")
gpus
(My machine keeps throwing errors when setting up the GPU!!!)
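For the GPU errors mentioned above, a common pattern (adapted from the TensorFlow GPU guide; this is a sketch, not the original code) is to wrap the configuration in try/except so the script falls back to the CPU instead of crashing:

```python
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
if gpus:
    try:
        # memory growth must be configured before any GPU has been initialized
        tf.config.experimental.set_memory_growth(gpus[0], True)
        tf.config.set_visible_devices([gpus[0]], "GPU")
    except RuntimeError as e:
        # these calls raise RuntimeError if the GPU was already initialized
        print(e)
```

If no GPU is visible at all, TensorFlow silently runs on CPU, which is plenty for a dataset of 303 rows.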
2. Load the data
Dataset description:
· age: age
· sex: sex
· cp: chest pain type (4 values)
· trestbps: resting blood pressure
· chol: serum cholesterol (mg/dl)
· fbs: fasting blood sugar > 120 mg/dl
· restecg: resting electrocardiographic results (values 0, 1, 2)
· thalach: maximum heart rate achieved
· exang: exercise-induced angina
· oldpeak: ST depression induced by exercise relative to rest
· slope: slope of the peak exercise ST segment
· ca: number of major vessels (0-3) colored by fluoroscopy
· thal: 0 = normal; 1 = fixed defect; 2 = reversible defect
· target: 0 = lower chance of heart attack, 1 = higher chance of heart attack
import pandas as pd
import numpy as np
df = pd.read_csv("F:\\布尔津\\heart.csv")
df
     age  sex  cp  trestbps  chol  fbs  restecg  thalach  exang  oldpeak  slope  ca  thal  target
0     63    1   3       145   233    1        0      150      0      2.3      0   0     1       1
1     37    1   2       130   250    0        1      187      0      3.5      0   0     2       1
2     41    0   1       130   204    0        0      172      0      1.4      2   0     2       1
3     56    1   1       120   236    0        1      178      0      0.8      2   0     2       1
4     57    0   0       120   354    0        1      163      1      0.6      2   0     2       1
..   ...  ...  ..       ...   ...  ...      ...      ...    ...      ...    ...  ..   ...     ...
302   57    0   1       130   236    0        0      174      0      0.0      1   1     2       0

[303 rows × 14 columns]
3. Check the data
# check for missing values
df.isnull().sum()
age         0
sex         0
cp          0
trestbps    0
chol        0
fbs         0
restecg     0
thalach     0
exang       0
oldpeak     0
slope       0
ca          0
thal        0
target      0
dtype: int64
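Besides missing values, it is worth checking how balanced the target classes are, since accuracy alone is misleading on skewed data. A minimal sketch with `value_counts()` — the counts below are illustrative placeholders, not read from heart.csv; on the real data you would call `df["target"].value_counts()`:

```python
import pandas as pd

# hypothetical stand-in for df["target"] with a 165/138 split
target = pd.Series([1] * 165 + [0] * 138)
counts = target.value_counts()
print(counts[1], counts[0])  # 165 138
```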
II. Data preprocessing
1. Split into training and test sets
🍺 The relationship between the test set and the validation set:
1. Strictly speaking, the validation set takes no part in gradient descent during training; it is not used to update the model's parameters.
2. In a broader sense, though, the validation set does take part in a "manual tuning" loop: based on the model's performance on the validation data after each epoch, we decide whether to stop training early, and we adjust hyperparameters such as the learning rate and batch_size according to how that performance evolves.
3. We can therefore say the validation set participates in training, but without the model being fitted to (overfitting) it.
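The early-stopping idea described above maps directly onto Keras's `EarlyStopping` callback. A configuration sketch (not part of the original code) showing how it would plug into `model.fit`:

```python
from tensorflow.keras.callbacks import EarlyStopping

# stop once val_loss has not improved for 10 epochs, and roll back to the best weights
early_stop = EarlyStopping(monitor="val_loss", patience=10, restore_best_weights=True)
# later: model.fit(..., validation_data=(X_test, y_test), callbacks=[early_stop])
```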
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
X = df.iloc[:,:-1]
y = df.iloc[:,-1]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, random_state=1)
X_train.shape, y_train.shape
((272, 13), (272,))
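With only 31 test samples, the class ratio in the test set can drift by chance; passing `stratify=y` to `train_test_split` keeps it stable. A self-contained sketch on synthetic data shaped like this dataset (the 165/138 class split here is a placeholder, not read from heart.csv):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# synthetic stand-ins shaped like the heart data: 303 samples, 13 features
rng = np.random.default_rng(1)
X = rng.normal(size=(303, 13))
y = np.array([1] * 165 + [0] * 138)

# stratify=y keeps the positive/negative ratio similar in both splits
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.1, random_state=1, stratify=y)
print(X_train.shape, y_train.shape, X_test.shape, y_test.shape)
# (272, 13) (272,) (31, 13) (31,)
```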
2. Standardization
# standardize each feature column to zero mean and unit variance (column-wise)
sc = StandardScaler()
X_train = sc.fit_transform(X_train)
X_test = sc.transform(X_test)
X_train = X_train.reshape(X_train.shape[0], X_train.shape[1], 1)  # (samples, timesteps, features)
X_test = X_test.reshape(X_test.shape[0], X_test.shape[1], 1)
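To confirm the scaler did what the comment claims, you can check that each column of the transformed matrix has zero mean and unit variance before reshaping; a sketch on random stand-in data:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X_train = rng.normal(loc=5.0, scale=3.0, size=(272, 13))  # stand-in for the real features

sc = StandardScaler()
X_scaled = sc.fit_transform(X_train)
print(np.allclose(X_scaled.mean(axis=0), 0.0, atol=1e-9))  # True
print(np.allclose(X_scaled.std(axis=0), 1.0, atol=1e-9))   # True

# reshape to (samples, timesteps, features): each of the 13 features becomes one timestep
print(X_scaled.reshape(272, 13, 1).shape)  # (272, 13, 1)
```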
III. Build the RNN model
🍺 Function prototype:
tf.keras.layers.SimpleRNN(units, activation='tanh', use_bias=True, kernel_initializer='glorot_uniform',
    recurrent_initializer='orthogonal', bias_initializer='zeros', kernel_regularizer=None, recurrent_regularizer=None,
    bias_regularizer=None, activity_regularizer=None, kernel_constraint=None, recurrent_constraint=None,
    bias_constraint=None, dropout=0.0, recurrent_dropout=0.0, return_sequences=False, return_state=False,
    go_backwards=False, stateful=False, unroll=False, **kwargs)
🍺 Key parameters:
· units: positive integer, dimensionality of the output space.
· activation: activation function to use; default: hyperbolic tangent (tanh). If None is passed, no activation is applied (i.e. linear activation: a(x) = x).
· use_bias: boolean, whether the layer uses a bias vector.
· kernel_initializer: initializer for the kernel weight matrix, used for the linear transformation of the inputs (see initializers).
· recurrent_initializer: initializer for the recurrent_kernel weight matrix, used for the linear transformation of the recurrent state (see initializers).
· bias_initializer: initializer for the bias vector (see initializers).
· dropout: float between 0 and 1; fraction of the units to drop for the linear transformation of the inputs.
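Under the hood, SimpleRNN computes the recurrence h_t = tanh(x_t · Wx + h_{t-1} · Wh + b) at every timestep. A small numpy sketch of one sample passing through the cell (4 units instead of 200, random weights, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
units, input_dim, timesteps = 4, 1, 13

Wx = rng.normal(size=(input_dim, units))  # kernel: input -> hidden
Wh = rng.normal(size=(units, units))      # recurrent kernel: hidden -> hidden
b = np.zeros(units)                       # bias

x = rng.normal(size=(timesteps, input_dim))  # one sample: 13 timesteps, 1 feature each
h = np.zeros(units)
for t in range(timesteps):
    h = np.tanh(x[t] @ Wx + h @ Wh + b)

# with return_sequences=False (the default), only the final hidden state is returned
print(h.shape)  # (4,)
```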
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense,LSTM,SimpleRNN
model = Sequential()
model.add(SimpleRNN(200,input_shape=(13,1),activation='relu'))
model.add(Dense(100,activation='relu'))
model.add(Dense(1,activation='sigmoid'))
model.summary()
Model: "sequential_1"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
simple_rnn_1 (SimpleRNN) (None, 200) 40400
dense_2 (Dense) (None, 100) 20100
dense_3 (Dense) (None, 1) 101
=================================================================
Total params: 60,601
Trainable params: 60,601
Non-trainable params: 0
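The parameter counts in the summary can be checked by hand: a SimpleRNN layer has units × (units + input_dim + 1) parameters (recurrent weights, input weights, one bias per unit), and a Dense layer has in × out + out:

```python
units, input_dim = 200, 1
rnn = units * (units + input_dim + 1)  # 200 recurrent + 1 input weight + 1 bias per unit
dense1 = 200 * 100 + 100               # weights + biases
dense2 = 100 * 1 + 1
print(rnn, dense1, dense2, rnn + dense1 + dense2)  # 40400 20100 101 60601
```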
IV. Compile the model
opt = tf.keras.optimizers.Adam(learning_rate=1e-4)
model.compile(loss='binary_crossentropy',
optimizer=opt,
metrics="accuracy")
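The `binary_crossentropy` loss compiled here is simply the mean negative log-likelihood of the sigmoid outputs; a numpy sketch with made-up labels and predictions:

```python
import numpy as np

y_true = np.array([1.0, 0.0, 1.0])  # hypothetical labels
y_pred = np.array([0.9, 0.2, 0.6])  # hypothetical sigmoid outputs
bce = -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))
print(round(bce, 4))  # 0.2798
```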
V. Train the model
epochs = 100
history = model.fit(X_train, y_train,
epochs=epochs,
batch_size=128,
validation_data=(X_test, y_test),
verbose=1)
Epoch 1/100
3/3 [==============================] - 1s 164ms/step - loss: 0.6844 - accuracy: 0.5441 - val_loss: 0.6815 - val_accuracy: 0.4839
Epoch 2/100
3/3 [==============================] - 0s 28ms/step - loss: 0.6759 - accuracy: 0.5882 - val_loss: 0.6658 - val_accuracy: 0.5484
Epoch 3/100
3/3 [==============================] - 0s 26ms/step - loss: 0.6678 - accuracy: 0.6654 - val_loss: 0.6518 - val_accuracy: 0.7742
Epoch 4/100
3/3 [==============================] - 0s 22ms/step - loss: 0.6606 - accuracy: 0.7279 - val_loss: 0.6384 - val_accuracy: 0.8710
Epoch 5/100
3/3 [==============================] - 0s 21ms/step - loss: 0.6535 - accuracy: 0.7390 - val_loss: 0.6257 - val_accuracy: 0.8710
...
Epoch 98/100
3/3 [==============================] - 0s 19ms/step - loss: 0.2598 - accuracy: 0.8971 - val_loss: 0.2922 - val_accuracy: 0.9032
Epoch 99/100
3/3 [==============================] - 0s 18ms/step - loss: 0.2621 - accuracy: 0.8860 - val_loss: 0.3007 - val_accuracy: 0.9032
Epoch 100/100
3/3 [==============================] - 0s 22ms/step - loss: 0.2615 - accuracy: 0.8787 - val_loss: 0.2992 - val_accuracy: 0.9032
VI. Model evaluation
import matplotlib.pyplot as plt
acc = history.history['accuracy']
val_acc = history.history['val_accuracy']
loss = history.history['loss']
val_loss = history.history['val_loss']
epochs_range = range(epochs)
plt.figure(figsize=(14, 4))
plt.subplot(1, 2, 1)
plt.plot(epochs_range, acc, label='Training Accuracy')
plt.plot(epochs_range, val_acc, label='Validation Accuracy')
plt.legend(loc='lower right')
plt.title('Training and Validation Accuracy')
plt.subplot(1, 2, 2)
plt.plot(epochs_range, loss, label='Training Loss')
plt.plot(epochs_range, val_loss, label='Validation Loss')
plt.legend(loc='upper right')
plt.title('Training and Validation Loss')
plt.show()
scores = model.evaluate(X_test, y_test, verbose=0)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))
accuracy: 90.32%
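To turn the trained model's sigmoid outputs into class labels, threshold `model.predict` at 0.5. A numpy sketch with hypothetical probabilities (the array below is made up, not real model output):

```python
import numpy as np

probs = np.array([0.91, 0.12, 0.55, 0.48])  # hypothetical model.predict() output
labels = (probs > 0.5).astype(int)          # 1 = higher chance of heart attack
print(labels.tolist())  # [1, 0, 1, 0]
```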